US20080294588A1 - Event capture, cross device event correlation, and responsive actions - Google Patents
- Publication number
- US20080294588A1 (application US 12/125,115)
- Authority
- US
- United States
- Prior art keywords
- event
- rules
- rule
- situation
- actions
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/172—Classification, e.g. identification
- G06V40/173—Classification, e.g. identification; face re-identification, e.g. recognising unknown faces across different face tracks
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/181—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
Definitions
- Guards are expensive to pay, and also capable of error. Particularly for larger facilities/facilities with many possible points of entry, it may not be possible to hire enough guards to “watch” everything going on. Thus, automated devices such as security cameras have also been added to the mix of security measures. The addition of security cameras meant that security personnel could “see” all of the interesting areas of the facility (i.e., points of entry, locations where things of value were stored, etc.). However, an increase in the number of cameras placed into a facility made it harder to watch all of the cameras simultaneously without hiring more personnel to watch the cameras. Doing so would remove the primary monetary advantage of using cameras, that is, not having to employ more personnel.
- Situational awareness includes an ability to take into account a variety of events that occur in and around a protected area or facility, including simultaneous or near-simultaneous events.
- situational awareness may include, in some situations, being aware that: multiple people have gone through a door on a single badge swipe (detected through a combination of an access control mechanism and video analytics); multiple doors are being tried (detected through door jiggle sensors); or perhaps something in a protected facility/area has gone out of an allowable range (i.e., level of fluid in a tank, acidity of a substance, etc.).
- Situational awareness as provided by embodiments described herein may include a number of phases.
- in a first phase, responses to individual alerts, or “singletons”, occur.
- a singleton is a single event that causes one or more actions to take place, such as flipping cameras, sounding alarms, etc. Combining singletons results in more complicated scenarios that require a rules engine and correlation between events, as described herein.
- a second phase of situational awareness may be realizing that there is much more information to be mined by combining the results of many different types of events and correlating them. For example, it is not a particularly interesting event to know that someone has just logged into a computer located within a building, nor is it a particularly interesting event to know that someone has left the building. However, it is very interesting to know that the same person that just logged into the computer located within the building is also reported to not be in the building.
- embodiments focus security personnel on unusual events, instead of the security personnel waiting for an external event (i.e., a phone call) to occur to warn that a problem is in progress.
- Embodiments of the invention thus include a rules engine used for correlating the events that come in from various statuses/alerting systems.
- the rules engine does not attempt to use any sort of Artificial Intelligence (AI) to learn what is normal, or to learn what is not normal. Rather, embodiments require that creators of rules, such as the programmers (system integrators) of the rules engine, determine what is not “normal”, and create rules to detect those “abnormal” situations. Note that defining anything that is “abnormal” or otherwise unusual is not required. Rather, serious thought about the normal course of events is required. Those thoughts may then be embodied as rules within the rules engine by programming them into the engine, and then looking for patterns of actual events that are counter to those rules. Using the example from above, the engine notices the pattern of a person who has been noted to have left a building but who then logs in to a computer that is supposed to be located within the building.
- Embodiments of the invention thus use events coming into the system and the rules that correlate them to provide situational awareness.
- the actions that are taken as a result and the policies that dictate how this is all acted upon may be varied from customer to customer.
- embodiments provide a broad and flexible security system that may easily be modified to meet a particular customer's needs.
- a method of correlating security event data according to one or more rules to initiate one or more actions includes receiving security event data via an event pipeline, security event data defining an occurrence of an event, wherein each type of source of event data may have its own pipeline, and creating event objects from the received security event data.
- the method also includes gathering the event objects into a rules engine, the rules engine may include a plurality of rules, wherein each rule defines one or more conditions to be met and one or more actions to be taken in response.
- the method also includes evaluating a rule from the plurality of rules within the rules engine at a frequency over a period of time using data contained within event objects, so as to determine whether the one or more conditions defined within the evaluated rule are met, and initiating the one or more actions defined by the evaluated rule whenever the one or more conditions defined by that rule are met.
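The receive/create/gather/evaluate flow of the preceding paragraphs can be sketched in a few lines. This is a minimal illustration, not the patented implementation: the `Rule` and `RulesEngine` classes, the event dictionaries, and the "left building but then logged in" example are assumptions made for the sketch.

```python
# Minimal sketch: receive event data as event objects, gather them into a
# rules engine, and evaluate each rule's conditions to fire its actions.

class Rule:
    def __init__(self, condition, actions):
        self.condition = condition   # callable: list of events -> bool
        self.actions = actions       # callables to run when the condition is met

class RulesEngine:
    def __init__(self):
        self.rules = []
        self.events = []

    def gather(self, event):
        self.events.append(event)

    def evaluate(self):
        """One evaluation pass; in practice this runs at a fixed frequency."""
        fired = []
        for rule in self.rules:
            if rule.condition(self.events):
                for action in rule.actions:
                    action()
                fired.append(rule)
        return fired

# Hypothetical rule: a person reported to have left the building has also
# just logged in to a computer inside the building.
alerts = []
engine = RulesEngine()
engine.rules.append(Rule(
    condition=lambda evs: any(
        e["type"] == "login" and any(
            o["type"] == "left_building" and o["person"] == e["person"]
            for o in evs)
        for e in evs),
    actions=[lambda: alerts.append("possible credential misuse")],
))
engine.gather({"type": "left_building", "person": "alice"})
engine.gather({"type": "login", "person": "alice"})
fired = engine.evaluate()
```

Evaluating the same rule set repeatedly over incoming event objects, rather than once, is what gives the correlation its "over a period of time" character.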
- creating may include pre-processing the received security event data to create event objects that include the received security event data in a canonical form.
- pre-processing may include supplementing an event object with related data not acquired from the source of the received security event data, by using an item of data within the received security event data to look up the related data, the related data used to set one or more attributes of the event object.
- the method may include, prior to evaluating a rule, injecting one or more facts into the rules engine, the one or more facts used to evaluate one or more rules.
- the method may include assigning weights to a group of rules within the rules engine, such that when the one or more conditions of a rule in the group of rules are met, a value of the weight assigned to that rule is added to an accumulator; including an accumulator rule in the rules engine, wherein a condition of the accumulator rule is defined as the values in the accumulator exceeding a given value, the accumulator rule defining one or more actions to occur upon the condition being met; evaluating the accumulator rule at a frequency over a period of time; and upon the condition of the accumulator rule being met, triggering the one or more actions defined in the accumulator rule to occur.
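The weighted-accumulator scheme above can be illustrated as follows. The `Accumulator` class, the weight values, and the threshold are assumptions for the sketch, not the claimed implementation.

```python
# Sketch: each rule in a group carries a weight that is added to an
# accumulator when its condition is met; a separate accumulator rule fires
# its actions once the accumulated total exceeds a given value.

class Accumulator:
    def __init__(self, threshold, actions):
        self.total = 0
        self.threshold = threshold
        self.actions = actions

    def add(self, weight):
        self.total += weight

    def evaluate(self):
        """The accumulator rule: fire actions when the total exceeds the threshold."""
        if self.total > self.threshold:
            for action in self.actions:
                action()
            return True
        return False

alarms = []
acc = Accumulator(threshold=10, actions=[lambda: alarms.append("escalate")])

# Individually minor events each contribute their assigned weight.
weighted_rules = {"door_jiggle": 4, "badge_denied": 3, "camera_blocked": 5}
for event in ["door_jiggle", "badge_denied", "camera_blocked"]:
    acc.add(weighted_rules[event])

tripped = acc.evaluate()   # 4 + 3 + 5 = 12 > 10, so the accumulator rule fires
```

The design lets events that are unremarkable on their own add up to an actionable condition when they cluster in time.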
- the method may include identifying, within the rules engine, a situation, wherein a situation is the occurrence of one or more events; creating, in a situation manager, a situation object, the situation object including data from one or more event objects that describe the one or more events, the included data identifying the one or more events that are occurring that make up the situation, the situation further including one or more actions to be taken upon occurrence of the situation; and injecting the situation object from the situation manager into the rules engine.
- the rules engine may include one or more rules where a first condition to be met is that a situation is occurring, and where a second condition to be met is that one or more further events occur, and the method may include, upon occurrence of the one or more further events, changing the situation object corresponding to the situation by adding the one or more further events to the situation object.
- the method may include, in response to adding the one or more further events to the situation object, further changing the situation object by changing the one or more actions to be taken upon occurrence of the situation, the change in the one or more actions based upon the occurrence of the one or more further events.
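The situation-object lifecycle of the preceding three paragraphs can be sketched as follows. The `Situation` and `SituationManager` classes, the escalation policy, and the event names are illustrative assumptions.

```python
# Sketch: the engine identifies a situation, a situation manager wraps it in
# a situation object, and later events are added to the object, which may in
# turn change the actions the situation prescribes.

class Situation:
    def __init__(self, events, actions):
        self.events = list(events)    # events that make up the situation
        self.actions = list(actions)  # actions to take upon its occurrence

    def add_event(self, event):
        self.events.append(event)
        # Illustrative escalation: a growing situation adds an action.
        if len(self.events) >= 2 and "notify_police" not in self.actions:
            self.actions.append("notify_police")

class SituationManager:
    def __init__(self):
        self.situations = []

    def create(self, events, actions):
        s = Situation(events, actions)
        self.situations.append(s)     # held here, then injected into the engine
        return s

manager = SituationManager()
s = manager.create(events=["door_forced"], actions=["flip_camera"])
s.add_event("glass_break")            # a further event changes the situation
```

Injecting the situation object back into the rules engine is what allows rules whose first condition is "a situation is occurring" to match against it.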
- initiating may include initiating the one or more actions defined by the evaluated rule whenever the one or more conditions defined by that rule are met, wherein one of the one or more actions includes intelligent controlling of a video device based on received security event data.
- initiating may include creating an output package to cause one or more types of a device to execute one or more steps, the device and the one or more steps it is to execute determined from the one or more actions defined by the evaluated rule, wherein the output package is standardized for each type of the device; and transmitting the created output package to the one or more types of the device, where upon receipt of its created output package, each of the one or more types of the device executes the one or more steps included in the output package.
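The output-package step above can be sketched as below. The package layout, the device type name, and the step strings are assumptions for illustration; the point is only that the package is standardized per device type before transmission.

```python
# Sketch: package the steps implied by a rule's actions in a form
# standardized for one device type, then transmit the package to devices
# of that type, which execute the steps on receipt.

def build_package(device_type, steps):
    """Create a standardized command package for one type of device."""
    return {"device_type": device_type, "steps": list(steps)}

sent = []

def transmit(package):
    # Stand-in for sending the package over the network to every device
    # of the matching type.
    sent.append(package)

pkg = build_package("ptz_camera", ["pan_to:door_3", "zoom:2x", "record"])
transmit(pkg)
```

Standardizing the package per device type keeps the rules themselves free of device-specific command syntax.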
- a computer program product stored on a computer readable medium, for correlating security event data according to one or more rules to initiate one or more actions.
- the computer program product includes computer program code for receiving security event data via an event pipeline, security event data defining an occurrence of an event, wherein each type of source of event data may have its own pipeline, and computer program code for creating event objects from the received security event data.
- the computer program product also includes computer program code for gathering the event objects into a rules engine, the rules engine comprising a plurality of rules, wherein each rule defines one or more conditions to be met and one or more actions to be taken in response, computer program code for evaluating a rule from the plurality of rules within the rules engine at a frequency over a period of time using data contained within event objects, so as to determine whether the one or more conditions defined within the evaluated rule are met, and computer program code for initiating the one or more actions defined by the evaluated rule whenever the one or more conditions defined by that rule are met.
- in another embodiment, there is provided a computer system.
- the computer system includes a memory; a processor; a network interface; and an interconnection mechanism coupling the memory, the processor, and the network interface, allowing communication therebetween.
- the memory is encoded with an event and rules correlating application that, when executed in the processor, provides an event and rules correlating process that correlates security event data according to one or more rules to initiate one or more actions.
- the event and rules correlating process causes the computer system to perform the operations of: receiving security event data via an event pipeline, security event data defining an occurrence of an event, wherein each type of source of event data may have its own pipeline; creating event objects from the received security event data; gathering the event objects into a rules engine, the rules engine comprising a plurality of rules, wherein each rule defines one or more conditions to be met and one or more actions to be taken in response; evaluating a rule from the plurality of rules within the rules engine at a frequency over a period of time using data contained within event objects, so as to determine whether the one or more conditions defined within the evaluated rule are met; and initiating the one or more actions defined by the evaluated rule whenever the one or more conditions defined by that rule are met.
- a computer program product is one embodiment that has a computer-readable medium including computer program logic encoded thereon that, when performed in a computerized device, provides associated operations for correlating events and rules as explained herein.
- the computer program logic when executed on at least one processor with a computing system, causes the processor to perform the operations (e.g., the methods) indicated herein as embodiments of the invention.
- Such arrangements of the invention are typically provided as software, code and/or other data structures arranged or encoded on a computer readable medium such as but not limited to an optical medium (e.g., CD-ROM, DVD-ROM, etc.), floppy or hard disk, a so-called “flash” (i.e., solid state) memory medium, or other physical medium, such as but not limited to firmware or microcode in one or more ROM or RAM or PROM chips, or as an Application Specific Integrated Circuit (ASIC), or as downloadable software images in one or more modules, shared libraries, etc.
- the software or firmware or other such configurations can be installed onto a computerized device to cause one or more processors in the computerized device to perform the techniques explained herein as embodiments of the invention.
- Software processes that operate in a collection of computerized devices, such as in a group of data communications devices or other entities may also provide the system of the invention.
- the system of the invention may be distributed between many software processes on several data communications devices, or all processes may run on a small set of dedicated computers, or on one computer alone.
- embodiments of the invention may be embodied strictly as a software program, as software and hardware, or as hardware and/or circuitry alone.
- the features disclosed and explained herein may be employed in computerized devices and software systems for such devices such as those manufactured by VidSys, Inc., of Vienna, Va.
- FIG. 1 shows a high-level block diagram of a computer system according to one embodiment disclosed herein.
- FIGS. 2A-2C illustrate examples of system structures upon which a procedure performed by the system of FIG. 1 may operate.
- FIGS. 3A-3E illustrate example code of various components of embodiments described by FIGS. 4-7 .
- FIG. 4 illustrates a flowchart of a procedure performed by the system of FIG. 1 when correlating events and rules so as to initiate actions related to security procedures.
- FIG. 5 illustrates a flowchart of a procedure performed by the system of FIG. 1 when assigning weights to various events such that, as weights accumulate, representing a group of events occurring in a period of time, particular actions are taken.
- FIG. 6 illustrates a flowchart of a procedure performed by the system of FIG. 1 when addressing situations and providing a situation object back to the rules engine so that the object is able to be modified over time as the situation changes.
- FIG. 7 illustrates a flowchart of a procedure performed by the system of FIG. 1 when initiating actions through use of commands sent to devices.
- disclosed embodiments provide methods and apparatus for correlating events to rules and in response, taking one or more actions.
- Data pertaining to events is continually received over time from a variety of sources, such as video cameras, badge readers, sensors, RFID tags, and so on. This data is collected and normalized into a standard format for processing. Data may be deemed unimportant and thus left out, or otherwise filtered out for any number of reasons.
- the data is processed into structures that may be supplemented with additional related data.
- the structures are then loaded into a rules engine that contains a plurality of rules defining different event scenarios (i.e., singleton events or combinations of events) that may occur, as well as one or more actions to be taken in response to the occurrence of the scenario(s) defined in the rule(s).
- the rules are continually evaluated over a period of time, and whenever the condition(s) for a rule is(are) met, the action(s) associated with that rule is(are) taken.
- FIG. 1 is a block diagram illustrating example architecture of a computer system 110 that executes, runs, interprets, operates or otherwise performs an event and rules correlation application 140 - 1 and an event and rules correlation process 140 - 2 suitable for use in explaining example configurations disclosed herein.
- the computer system 110 may be any type of computerized device such as a personal computer, workstation, portable computing device, console, laptop, network terminal or the like.
- the computer system 110 includes an interconnection mechanism 111 such as a data bus or other circuitry that couples a memory system 112 , a processor 113 , an input/output interface 114 , and a communications interface 115 .
- An input device 116 (e.g., one or more user controlled devices such as a keyboard, mouse, touchpad, trackball, etc.) couples to the processor 113 through the I/O interface 114 and enables a user 108 such as a web page developer to provide input commands and generally control a graphical user interface 160 shown on a display 130 .
- the communications interface 115 enables the computer system 110 to communicate with other devices (e.g., other computers) on a network (not shown in FIG. 1 ).
- the memory system 112 is any type of computer readable medium and in this example is encoded with an event and rules correlation application 140 - 1 .
- the event and rules correlation application 140 - 1 may be embodied as software code such as data and/or logic instructions (e.g., code stored in the memory or on another computer readable medium such as a removable disk) that supports processing functionality according to different embodiments described herein.
- the processor 113 accesses the memory system 112 via the interconnection mechanism 111 in order to launch, run, execute, interpret or otherwise perform the logic instructions of the event and rules correlation application 140 - 1 .
- Execution of the event and rules correlation application 140 - 1 in this manner produces processing functionality in an event and rules correlation process 140 - 2 .
- the event and rules correlation process 140 - 2 represents one or more portions or runtime instances of the event and rules correlation application 140 - 1 performing or executing within or upon the processor 113 in the computer system 110 at runtime.
- example configurations disclosed herein include the event and rules correlation application 140 - 1 itself including the event and rules correlation process 140 - 2 (i.e., in the form of un-executed or non-performing logic instructions and/or data).
- the event and rules correlation application 140 - 1 may be stored on a computer readable medium (such as a floppy disk), hard disk, electronic, magnetic, optical or other computer readable medium.
- the event and rules correlation application 140 - 1 may also be stored in a memory system 112 such as in firmware, read only memory (ROM), or, as in this example, as executable code in, for example, Random Access Memory (RAM).
- the display 130 need not be coupled directly to computer system 110 .
- the event and rules correlation application 140 - 1 may be executed on a remotely accessible computerized device via the network interface 115 .
- the graphical user interface 160 may be displayed locally to a user of the remote computer and execution of the processing herein may be client-server based.
- the graphical user interface 160 may be a customer interface through which a user, such as the user 108 , is able to learn various information and take various actions.
- the amount of features, and control thereof, may depend on a user level, such that a basic user has access to only a certain amount of features, while an administrator may have access to all available features.
- Key features of the graphical user interface 160 may include the ability to locate and activate cameras easily from a searchable directory; the ability to locate and activate cameras easily from a map representation; the ability to control cameras and NDVRs to effect Pan-Tilt-Zoom (PTZ), Iris, Focus, Playback, etc.; the ability to list and get details on incoming events; the ability to locate people and use collaboration tools to interact with them; the ability to locate shared folders and reference and activate files; the ability to control video walls; the ability to initiate video tours, which are described in detail in co-pending U.S. patent application Ser. No. ______, entitled “INTELLIGENT VIDEO TOURS” and given Attorney Docket VID08-04; and administrative features such as but not limited to create/modify/list/delete users, roles, tours, devices, etc.
- FIGS. 2A-2C illustrate examples of system structures upon which the event and rules correlating process 140 - 2 may operate.
- FIGS. 3A-3E include example code of various components of embodiments described below, and are referenced throughout the discussion of FIGS. 4-7 .
- FIGS. 4-7 are flowcharts of various embodiments of the event and rules correlating process 140 - 2 .
- the rectangular elements are herein denoted “processing blocks” and represent computer software instructions or groups of instructions. Alternatively, the processing blocks represent steps performed by functionally equivalent circuits such as a digital signal processor circuit or an application specific integrated circuit (ASIC).
- FIG. 4 illustrates an embodiment of the event and rules correlating application 140 - 1 , executing as the event and rules correlating process 140 - 2 , to correlate security event data according to one or more rules to initiate one or more actions.
- the event and rules correlating process 140 - 2 first receives security event data via an event pipeline, step 401 .
- Security event data defines an occurrence of an event.
- An event is simply something that takes place and is, in some manner, recorded or otherwise noticed by a device to which the event and rules correlating process 140 - 2 has access.
- the security event data may include the id number of the badge and the location of the badge reader.
- the event would be using the security badge.
- a further event may be the opening and closing of the door, if there is a sensor on the door capable of noticing that event.
- the event data for such a thing would be a reading from the sensor, first indicating the opening of the door, and then indicating the shutting of the door.
- the number of devices that may serve as sources of event data for the event and rules correlating process 140 - 2 is large.
- the event and rules correlating process 140 - 2 is able to interface with any number of security-related devices (video cameras, sensors, lighting systems, badge readers, electronic locks, etc.). Those devices differ not only in function, but also in the format of the data they provide and in how they are controlled. Thus, the event and rules correlating process 140 - 2 must be flexible enough to deal with hundreds, perhaps thousands, of different data formats.
- the event and rules correlating process 140 - 2 achieves this flexibility through use of an input pipeline.
- the pipeline may be thought of as an externally defined, customizable stack 210 of various layers. Each layer in the stack 210 is relatively simple and is configured to perform a single function or a small set of functions.
- the layers may include a transport layer 212 , a parser layer 214 , and a normalize layer 216 .
- the layers may also include optional log layers 218 and 220 , and an optional filtering layer 219 .
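The layered stack described above (transport, parser, normalize, plus optional log and filter layers) can be sketched as a chain of small functions, each handing its output to the next. The field format (`key=value` pairs separated by semicolons) and the layer implementations are assumptions for the sketch.

```python
# Sketch of the input pipeline stack 210: each layer does one small job
# and passes its result to the next layer.

def transport(raw_bytes):
    return raw_bytes          # deliver the incoming byte stream as-is

def parser(data):
    # Format-specific conversion of the raw byte stream into a structure
    # the remaining layers can operate on.
    return dict(field.split("=") for field in data.decode().split(";"))

def log(record):
    return record             # a real log layer would also persist the record

def normalize(fields):
    # Put the parsed fields into a canonical form (here: lowercase keys).
    return {key.lower(): value for key, value in fields.items()}

def run_pipeline(raw_bytes, layers):
    data = raw_bytes
    for layer in layers:
        data = layer(data)
    return data

event = run_pipeline(b"BadgeId=1234;Reader=Door7",
                     [transport, parser, log, normalize])
```

Because each layer is simple and single-purpose, supporting a new device type amounts to swapping in a different transport or parser while reusing the rest of the stack.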
- each type of source of event data has its own pipeline. That is, cameras may have their own pipeline (not necessarily for video but rather, for example, for analytic data), while sensors may have their own pipeline, badge readers may have their own pipeline, and so on.
- a type of source need not be simply the different devices that deliver event data to the event and rules correlating process 140 - 2 , but may also include variations within those devices.
- a first badge reader manufactured by one company may have a different data format and control mechanism than a second badge reader manufactured by another company.
- the event and rules correlating process 140 - 2 thus has a pipeline for the first badge reader (and any/all other badge readers of the same type) and a separate pipeline for the second badge reader (and any/all other badge readers of the same type).
- different types of sources of event data may share a pipeline as is appropriate.
- the event and rules correlating process 140 - 2, through the pipeline on its external side, communicates with the source of the event data (i.e., a device) using the language and transport (i.e., protocol) that is defined by the source.
- examples of transports may include but are not limited to HTTP, File, SNMP, SMTP, SOAP, TCP, and the like.
- the transport layer 212 takes event data in and provides it as raw event data to the parser layer 214 .
- the event data, after coming out of a transport layer, is essentially an unpackaged byte stream.
- the parser layer 214 may include rules for pulling fields from XML, or running regular expressions, or busting apart ASCII strings into a particular form.
- the parser layer 214 takes the raw event data and its attributes, as delivered by the transport layer 212 , and converts that raw event data into a structure that may be operated on by the remaining layers of the pipeline. Note that the parser layer 214 does not understand or know, or make judgments on, the raw event data received by the event and rules correlating process 140 - 2 . Rather, the parser layer 214 does a format specific conversion from the incoming byte stream (i.e., raw event data) to an implementation-specific format for the remainder of the pipeline.
- the parser layer 214 might convert the incoming fields to the internal format, perhaps as an XML document.
- the parser layer 214 may pass the ones provided along, but add to it the transport specific fields, original URL, arrival time, host, etc.
- the event and rules correlating process 140 - 2 next creates event objects from the received security event data, step 402 .
- An event object is a data structure that may, and typically does, include one or more attributes. That is, an event object may be thought of as a fact, with other related information included as well. In general, the more facts the event and rules correlating process 140 - 2 has, the better.
- Because the raw security event data received by the event and rules correlating process 140 - 2 may be in any one of a number of formats, the event and rules correlating process 140 - 2 must first place that data into a standardized format in order to easily use it.
- the event and rules correlating process 140 - 2 may create an event object by pre-processing the received security event data to create event objects that include the received security event data in a canonical form, step 406 .
- the pre-processing may include transforming, combining, and/or deleting fields or even events, among other actions.
- the event and rules correlating process 140 - 2 performs this pre-processing (and thus the creation of the event object) in at least the normalizer layer 216 .
- the event object created by the event and rules correlating process 140 - 2 for that event data would include the badge id.
- the event object may also include related attributes, such as information that identifies the location of the badge reader, the owner of the badge with that badge id, and so on.
- the related attributes do not come from the source of the event data, but rather must be acquired from another source or sources.
- the event and rules correlating process 140 - 2 may supplement an event object with related data not acquired from the source of the received security event data, by using an item of data within the received security event data to look up the related data, the related data used to set one or more attributes of the event object, step 407 .
- the event and rules correlating process 140 - 2 may use the badge id to perform a lookup in a database to determine who the owner of the badge id is, and may place this information in an attribute of the event object.
- the event and rules correlating process 140 - 2 may use a first attribute to acquire other attributes.
- the event and rules correlating process 140 - 2 may use the name of the owner of the badge with the received badge id to lookup that person's permissions for entering various restricted areas of a protected facility.
- the event and rules correlating process 140 - 2 may also set those permissions as attributes within the event object, or in a separate event object.
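The chained enrichment described in the preceding paragraphs (badge id → owner, then owner → permissions) can be sketched as below. The in-memory lookup tables are illustrative stand-ins for the database the process would actually query.

```python
# Sketch of event-object enrichment: an item of data within the received
# event (the badge id) is used to look up related data (the owner), and a
# first derived attribute is then used to look up further attributes
# (the owner's permissions).

BADGE_OWNERS = {"1234": "alice"}                      # stand-in for a database
PERMISSIONS = {"alice": {"server_room", "lobby"}}     # stand-in for a database

def enrich(event):
    event = dict(event)                               # do not mutate the input
    owner = BADGE_OWNERS.get(event["badge_id"])
    if owner is not None:
        event["owner"] = owner
        # Chained lookup: use the first derived attribute to acquire more.
        event["permissions"] = PERMISSIONS.get(owner, set())
    return event

event = enrich({"badge_id": "1234", "reader": "door_7"})
```

Enriching the event object up front means later rules can test attributes such as permissions directly, without performing their own lookups.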
- the event and rules correlating process 140 - 2 may use the optional filtering layer 219 to remove unnecessary data from an event object. For example, data from a sensor reporting its current value may arrive five times in a short duration of time (say, 10 seconds). The event and rules correlating process 140 - 2 may use the filtering layer 219 to determine that, because the value reported by the sensor is within an accepted range of error, the sensor has essentially reported the same value five times, and the event object need not contain that value five times over; once will suffice. Thus, the event and rules correlating process 140 - 2 may be programmed to store or otherwise note a current value of data for later comparisons.
- the event and rules correlating process 140 - 2 may use the filtering layer 219 to only remove certain pieces of data. For example, the same sensor may, over a ten minute period of time, occasionally report the same value, but viewed over the ten minutes, the value reported by the sensor shows a steady increase.
- the event and rules correlating process 140 - 2 may use the filtering layer 219 to remove only unnecessary repeats of a value, to thus make sure the event object contains the values showing that increase over time. In such a situation, the event and rules correlating process 140 - 2 is thus configured to not only maintain a current value, but also a number of recent values as well.
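The filter-layer behavior described above can be sketched as a comparison against the most recently kept value: repeats within an accepted error range are dropped, while a steady trend is preserved. The tolerance value and the reading sequences are illustrative assumptions.

```python
# Sketch of the filtering layer 219: drop a new sensor reading only when it
# repeats the previously kept value within an accepted range of error, so a
# flat-lining sensor reports once while a steady increase is kept intact.

def filter_readings(readings, tolerance=0.5):
    kept = []
    for value in readings:
        # Keep the value if it is the first, or differs enough from the
        # last kept value to be informative.
        if not kept or abs(value - kept[-1]) > tolerance:
            kept.append(value)
    return kept

flat = filter_readings([5.0, 5.1, 4.9, 5.0, 5.05])   # near-identical repeats
rising = filter_readings([5.0, 6.0, 7.0, 8.0])        # steady increase
```

Comparing against the last *kept* value, rather than only the immediately preceding reading, is what keeps a slow drift from being filtered away one small step at a time.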
- an input pipeline as used by the event and rules correlating process 140 - 2 is generic, such that it may be extended to support more devices by adding support to the event and rules correlating process 140 - 2 for transports, parsers, and transforms for those devices.
- for some sources of event data (i.e., devices), custom transports and/or custom parsers may have to be developed.
- event data in a pipeline may be logged at any time, by using either of the two optional log layers 218 and/or 220 .
- the event and rules correlating process 140 - 2 then gathers the event objects into a rules engine, step 403 .
- the rules engine comprises a plurality of rules. Each rule defines one or more conditions to be met and one or more actions to be taken in response. Example rules are shown in FIGS. 3B-3E . Rules are loaded from an external file and are thus not hard coded. Rules may be created by users, in some embodiments as XML files, or via a plug-in that supports a semi-natural language way to create syntax for proper rules, or via a graphical user interface. Some rules represent singleton events, that is, only one condition need occur. For example, a rule may be defined to send an alert message when a device including an RFID tag crosses a threshold point.
- rules may represent multiple events of varying complexity. Thus, such a rule may require that a device including an RFID tag crosses a threshold point and that, prior to this, a badge was used to open the door at the threshold point. Patterns may be important in some rules, while order may be important in others.
- the rules are able to define what the criteria are (i.e., the conditions) for when an action is to be initiated, and are thus easily able to cover a variety of constraints and scenarios.
- the rules engine is not artificially intelligent, but rather is able to be programmed with rules to meet specific needs. Thus, it is possible to write rules to expect specific things or not expect specific things, but it is not possible to write a rule that says to ‘watch for something unusual’. To watch for something unusual is really to define everything that is usual and then declare what the unusual patterns are. Since the device input drivers convert data from different devices into a canonical form, rules may be written that look for conditions across different devices.
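Because all event data is converted to a canonical form, a rule's conditions can reference events from different device types uniformly. The following sketch illustrates that idea with invented field names and a deliberately simplified rule structure; it is not the patent's rules engine.

```python
# Illustrative sketch (not the patent's implementation) of a rule whose
# conditions span events from different device types, which is possible
# because all events share a canonical form.

from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Event:
    device_type: str       # canonical fields shared by all device types
    location: str
    data: dict = field(default_factory=dict)

@dataclass
class Rule:
    conditions: List[Callable[[List[Event]], bool]]
    actions: List[Callable[[], None]]

    def evaluate(self, events: List[Event]) -> bool:
        # every condition must be satisfied before any action is taken
        if all(cond(events) for cond in self.conditions):
            for action in self.actions:
                action()
            return True
        return False

fired = []
rule = Rule(
    conditions=[
        lambda evs: any(e.device_type == "badge_reader" and
                        e.data.get("door") == "D7" for e in evs),
        lambda evs: any(e.device_type == "rfid" and
                        e.data.get("crossed") == "D7" for e in evs),
    ],
    actions=[lambda: fired.append("alert")],
)

events = [Event("badge_reader", "hall", {"door": "D7"}),
          Event("rfid", "hall", {"crossed": "D7"})]
rule.evaluate(events)
print(fired)    # both conditions met, so the action ran: ['alert']
```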
- the event and rules correlating process 140 - 2 evaluates a rule from the plurality of rules within the rules engine at a frequency over a period of time using data contained within event objects, so as to determine whether the one or more conditions defined within the evaluated rule are met, step 404 . That is, the event and rules correlating process 140 - 2 causes the rules engine to look at the conditions defined by a rule and see if there are event objects, and in some cases other facts, containing data (i.e., describing events) that satisfy all of those conditions.
- using the example rule shown in FIG. 3B , the event and rules correlating process 140 - 2 looks for one or more event objects identifying location(s) to which access is restricted, and for one or more event objects identifying a person in such a restricted location even though that person does not have clearance to be in the restricted location. If the event and rules correlating process 140 - 2 finds such event objects (and the current time, determined as discussed below, is after hours), then the event and rules correlating process 140 - 2 will initiate the actions defined by the example rule, as discussed further below.
- the event and rules correlating process 140 - 2 may inject one or more facts into the rules engine, step 406 .
- the one or more facts may be used to evaluate one or more rules.
- the example rule shown in FIG. 3B requires knowledge of current time of day.
- the event and rules correlating process 140 - 2 is able to acquire that time from a source (a database or a clock), and then inject it into the rules engine.
- the rules engine is able to find out the time, and evaluate the rule appropriately.
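Fact injection as described above can be sketched as follows: a value the rules engine cannot obtain from event objects alone (here, the current time of day) is acquired from a source and injected as a fact that rule conditions may reference. Class and function names are illustrative assumptions.

```python
# Hedged sketch of fact injection: current time of day, acquired from a
# clock or database, is injected into the rules engine so that a rule
# such as the "after hours" example can be evaluated appropriately.

import datetime

class RulesEngine:
    def __init__(self):
        self.facts = {}       # injected facts, usable by rule conditions
        self.events = []

    def inject_fact(self, name, value):
        self.facts[name] = value

engine = RulesEngine()
engine.inject_fact("time_of_day", datetime.time(23, 30))

def after_hours(engine) -> bool:
    # an assumed definition of "after hours": 6 p.m. through 6 a.m.
    t = engine.facts["time_of_day"]
    return t >= datetime.time(18, 0) or t <= datetime.time(6, 0)

print(after_hours(engine))   # True at 23:30
```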
- the event and rules correlating process 140 - 2 may use a data aggregator to perform these actions.
- the data aggregator buffers, out of band of the input pipeline, events that are targeted for the database.
- the data aggregator performs the buffering for a short period of time (e.g., seconds) and then writes the event data to the database as a collective blob.
- the event and rules correlating process 140 - 2 pulls these blobs from the database in order, unpacks the individual events within the blobs, and serially (i.e., in order) inserts those events into the input pipeline stages, to be re-processed and re-entered into the rules engine for the forensics analysis.
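The aggregator behavior above (briefly buffering database-bound events, writing them as a collective blob, then later unpacking the blobs in order and serially re-inserting the events) can be sketched as follows. The storage and serialization choices here are assumptions for illustration.

```python
# Illustrative sketch of the data aggregator and forensic replay: events
# targeted for the database are buffered, written as one blob, and later
# pulled back in order, unpacked, and serially re-inserted.

import json

database = []                      # stand-in for the real database

def write_blob(buffered_events):
    """Write a short buffer of events to the database as one blob."""
    database.append(json.dumps(buffered_events))

def replay(pipeline):
    """Unpack blobs in order and serially re-insert each event."""
    for blob in database:
        for event in json.loads(blob):
            pipeline.append(event)     # re-processed, re-entered

write_blob([{"id": 1, "type": "door"}, {"id": 2, "type": "badge"}])
write_blob([{"id": 3, "type": "camera"}])

pipeline = []
replay(pipeline)
print([e["id"] for e in pipeline])     # events come back in order: [1, 2, 3]
```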
- These components, as well as the input pipeline, the rules engine, and other components of a system upon which the event and rules correlating process 140 - 2 may operate, are shown in example system 200 of FIG. 2B .
- the event and rules correlating process 140 - 2 initiates the one or more actions defined by the evaluated rule whenever the one or more conditions defined by that rule are met, step 405 .
- An action may be any command that an associated device is capable of carrying out.
- because the event and rules correlating process 140 - 2 may be associated with any number of security-related devices (video cameras, sensors of all types, microphones and other audio recording devices, NDVRs, lighting systems, badge readers, electronic locks, etc.) and is able to interface via the network interface 115 (shown in FIG. 1 ) with other network-enabled devices (computer systems, etc.) and with networks themselves, the number of possible actions, as well as the depth of possible actions, is quite large.
- actions may include, for example, sending messages using any number of transmission media (i.e., e-mail, phone call, facsimile, page, text message, video message, printout to a printing device, pop-up box or other alert mechanism displayed on a display device, etc.), and sending control commands to any associated devices, as is described in greater detail with regards to FIG. 7 below.
- the event and rules correlating process 140 - 2 addresses scenarios in which a number of small events, each not troubling on its own, are recognized as being part of a larger event, such that, when combined, the small events may represent a troubling problem.
- the event and rules correlating process 140 - 2 is capable of dealing with events that are not very interesting singly, or even in simple patterns, but are interesting if, for example, too many of them happen in too short a time period.
- the event and rules correlating process 140 - 2 first receives security event data via an event pipeline, security event data defining an occurrence of an event, wherein each type of source of event data has its own pipeline, step 501 . Then the event and rules correlating process 140 - 2 creates event objects from the received security event data, step 502 . The event and rules correlating process 140 - 2 gathers the event objects into a rules engine, the rules engine comprising a plurality of rules, wherein each rule defines one or more conditions to be met and one or more actions to be taken in response, step 503 .
- the event and rules correlating process 140 - 2 may then assign weights to a group of rules within the rules engine, step 504 .
- whenever a rule in the group evaluates positively, the value of the weight assigned to that rule is added to an accumulator.
- Any number of rules may be included in the group of rules.
- the group of rules represents a number of events that, when considered separately, are of little importance, but that, when they occur simultaneously or otherwise near each other, may be of much greater importance. For example, an event such as a door handle to a building being jiggled is, by itself, a scenario perhaps worthy of little attention.
- the event and rules correlating process 140 - 2 may assign weights in any number of ways, and to any grouping of rules (and thus grouping of events). For example, the event and rules correlating process 140 - 2 may assign the jiggling of a door handle a lesser weight than the breaking of a window. Further, weights may be based on duration as well as time of day. That is, the event and rules correlating process 140 - 2 may assign weights such that the same door handle being jiggled five times in a week is assigned a lesser weight than the same door handle being jiggled five times in a minute.
- the event and rules correlating process 140 - 2 may assign weights such that a door handle being jiggled at noon, while the building is full of people, has a lesser weight than a door handle being jiggled at midnight, after the building has been unoccupied for hours.
- the weight the event and rules correlating process 140 - 2 assigns to an event does not have to remain fixed.
- a weight may also be affected by how long the event remains in the rules engine.
- the event and rules correlating process 140 - 2 may calculate and assign a weight that is related to the time since the event and rules correlating process 140 - 2 created the event object representing that event, and that is further decremented as more time passes.
- the event and rules correlating process 140 - 2 may increase a weight as the time since an event object was created increases.
- the event and rules correlating process 140 - 2 may increase the weight assigned to that event because the event has not been dealt with.
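The two time-dependent weight policies described above (a weight decremented as the event ages, and a weight increased the longer the event goes unhandled) can be sketched as follows. The decay curve, half-life, and escalation rate are invented for illustration.

```python
# Hypothetical sketch of time-dependent weights: one policy decays a
# weight as time since event-object creation increases, another raises
# it because the event has not been dealt with.

def decayed_weight(base: float, age_seconds: float,
                   half_life: float = 300.0) -> float:
    """Weight decremented as more time passes since the event object
    was created (exponential decay with an assumed half-life)."""
    return base * 0.5 ** (age_seconds / half_life)

def escalated_weight(base: float, age_seconds: float,
                     per_minute: float = 0.5) -> float:
    """Weight increased because the event has not been dealt with."""
    return base + per_minute * (age_seconds / 60.0)

print(decayed_weight(10.0, 300.0))    # half the base after one half-life
print(escalated_weight(10.0, 120.0))  # grows by 0.5 per unhandled minute
```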
- the event and rules correlating process 140 - 2 must, in addition to assigning weights to existing rules, also include an accumulator rule in the rules engine, step 505 .
- a condition of the accumulator rule is defined as the value in the accumulator associated with that rule exceeding a given value.
- the accumulator rule also defines one or more actions to occur upon the condition being met.
- the accumulator rule is what allows the event and rules correlating process 140 - 2 to monitor the value stored in an accumulator, and then take action if needed.
- the event and rules correlating process 140 - 2 may, of course, include a number of accumulator rules in the rules engine, all covering different rules from the same group of rules to which the event and rules correlating process 140 - 2 assigned weights.
- the group of rules to which the event and rules correlating process 140 - 2 assigned weights may include rules related to badge readers, rules related to video cameras, and rules related to infrared sensors.
- the event and rules correlating process 140 - 2 may include an accumulator rule that, for example, deals only with the badge reader rules, as well as an accumulator rule that deals with only the video camera rules, and an accumulator rule that deals with the badge reader rules, the video camera rules, and the infrared sensor rules.
- the variety of possible accumulator rules allows a user of the event and rules correlating process 140 - 2 to effectively define what combinations of events, and what combinations of weights for those events, result in one or more actions being taken.
- the event and rules correlating process 140 - 2 will then evaluate an accumulator rule at a frequency over a period of time, step 506 , similar to how the event and rules correlating process 140 - 2 evaluates any rule stored within the rules engine. Finally, upon the condition(s) of an accumulator rule being met, the event and rules correlating process 140 - 2 triggers the one or more actions defined in the accumulator rule to occur, step 507 .
- an accumulator may have different levels associated with it.
- an accumulator rule may indicate that, when the value in the accumulator passes a first given value, the events being accumulated may represent a low-level threat, and certain related actions result (i.e., providing an alert to security personnel).
- when the value in the accumulator passes a second given value, the events being accumulated may represent a mid-level threat, and certain other actions result (i.e., security personnel being summoned to a particular area).
- the event and rules correlating process 140 - 2 may visualize each level on the graphical user interface 160 (shown in FIG. 1 ), such as, for example, as a dial or bar, or using a color-coded scheme.
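The leveled accumulator described above can be sketched as follows: weighted rule firings are added to an accumulator, and crossing a first given value triggers a low-level action while crossing a second triggers a mid-level one. The thresholds, weights, and action strings are made up for illustration.

```python
# Sketch of an accumulator with two threat levels. When weighted rule
# firings push the accumulated value past the first given value, a
# low-level action results; past the second, a mid-level action.

alerts = []

class Accumulator:
    LOW, MID = 5.0, 10.0          # first and second given values (assumed)

    def __init__(self):
        self.value = 0.0

    def add(self, weight: float):
        self.value += weight
        if self.value > self.MID:
            alerts.append("summon security to area")
        elif self.value > self.LOW:
            alerts.append("alert security personnel")

acc = Accumulator()
acc.add(3.0)    # door handle jiggled: below both thresholds, no action
acc.add(3.0)    # jiggled again soon after: passes the low-level threshold
acc.add(6.0)    # window broken: passes the mid-level threshold
print(alerts)
```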
- a plurality of accumulators may be associated with particular geographic locations. That is, a first accumulator may be associated with a first building, a second accumulator associated with a second building, and so on. Alternatively, or additionally, a plurality of accumulators may be associated with a hierarchy of geographic locations. Thus, a first accumulator may be associated with a first floor of a first building, a second accumulator may be associated with a first floor of a second building, while a third accumulator may be associated with the first building and the second building.
- accumulators may count events themselves, instead of weights associated with events.
- an accumulator rule could count, for example, the number of people entering a room, and if that count exceeded a given value defined in the accumulator rule, an action would result.
- FIG. 6 illustrates the event and rules correlating process 140 - 2 recognizing a number of events as a situation, and then injecting the situation back into the rules engine for further processing over time.
- the event and rules correlating process 140 - 2 first receives security event data via an event pipeline, security event data defining an occurrence of an event, wherein each type of source of event data has its own pipeline, step 601 .
- the event and rules correlating process 140 - 2 then creates event objects from the received security event data, step 602 , and gathers the event objects into a rules engine, step 603 , as described herein.
- the rules engine comprises a plurality of rules, wherein each rule defines one or more conditions to be met and one or more actions to be taken in response.
- the event and rules correlating process 140 - 2 evaluates a rule from the plurality of rules within the rules engine at a frequency over a period of time using data contained within event objects, step 604 , so as to determine whether the one or more conditions defined within the evaluated rule are met.
- the event and rules correlating process 140 - 2 may identify, within the rules engine, a situation, wherein a situation is the occurrence of one or more events, step 606 .
- a situation is a number of events that occur simultaneously or within a span of time.
- Situations may include any number of events. Events comprising a situation may or may not be related. As discussed further below, a situation may change over time. Further, any number of situations may occur simultaneously, and multiple situations may include some of the same events.
- the event and rules correlating process 140 - 2 identifies a situation by detecting that a number of rules have evaluated positively and that the actions defined in those rules have occurred. That is, the event and rules correlating process 140 - 2 detects that the conditions for a number of rules (each rule corresponding to an event, and its conditions corresponding to the occurrence of that event) have all been met, where the occurrence of those events constitutes the situation.
- a situation may be a coordinated attempt to rob a bank.
- the events that make up the situation may include power being cut to video cameras, phone lines being cut, cellular phones being blocked, lights being turned off, incorrect combinations being given to a vault, and so on.
- the rules engine includes rules that describe all of these events through one or more conditions.
- when the event and rules correlating process 140 - 2 detects that all of these conditions for these rules have been met, such that the actions defined in the rules are initiated, the event and rules correlating process 140 - 2 identifies a situation.
- the situation contains the events described by all the met conditions of the rules.
- the event and rules correlating process 140 - 2 will then create, in a situation manager, a situation object, step 607 .
- the situation object includes data from one or more event objects that describe the one or more events.
- the included data identifies the one or more events that are occurring that make up the situation.
- the situation object further includes one or more actions to be taken upon occurrence of the situation.
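The situation object described above, holding both the event data that identifies the situation and the actions to be taken, and changeable over time per steps 609 and 610, can be sketched as follows. The class shape and the bank-robbery event names are illustrative assumptions drawn from the example in this section.

```python
# Hedged sketch of a situation object: it carries data identifying the
# events that make up the situation plus the actions to take while the
# situation is occurring, and both may change as the situation evolves.

class SituationObject:
    def __init__(self, events, actions):
        self.events = list(events)     # data from the event objects
        self.actions = list(actions)   # taken upon occurrence

    def add_event(self, event):
        self.events.append(event)      # the situation changed over time

    def replace_actions(self, actions):
        self.actions = list(actions)   # the response changed accordingly

situation = SituationObject(
    events=["cameras_power_cut", "phone_lines_cut", "vault_bad_combo"],
    actions=["notify_police"],
)
# the thieves obtain the real combination: update event data and actions
situation.add_event("vault_opened_with_correct_combo")
situation.replace_actions(["notify_police", "activate_vault_alarms"])
print(situation.events[-1], situation.actions)
```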
- the event and rules correlating process 140 - 2 injects the situation object from the situation manager into the rules engine, step 608 .
- the event and rules correlating process 140 - 2 is able to maintain state of all the event data that identifies the events that comprise the situation. That is, a situation tends to change over time.
- the thieves may acquire the correct combination to a vault through coercion or other means, and thus they may be able to open that vault.
- the event and rules correlating process 140 - 2 is then able to update the situation object, as described below, so that the event data shows that the correct combination was provided and thus the vault is now opened, as opposed to unsuccessful attempts being made to provide the correct combination to the vault.
- the rules engine will include one or more rules where a first condition to be met is that a situation is occurring, and where a second condition to be met is that one or more further events occur.
- upon occurrence of the one or more further events (i.e., the proper combination being provided to the bank vault so that it opens), the event and rules correlating process 140 - 2 changes the situation object corresponding to the situation by adding the one or more further events to the situation object, step 609 .
- events may be deleted from a situation object, or otherwise modified (e.g., updated), instead of, or in addition to, being added.
- the event and rules correlating process 140 - 2 may also, in response to doing so, further change the situation object by changing the one or more actions to be taken upon occurrence of the situation, step 610 .
- the change in the one or more actions may be based upon the occurrence of the one or more further events.
- the event and rules correlating process 140 - 2 may, upon the correct combination being provided to open the vault, change the situation object so that one of the actions taken in response to the situation is to activate alarms and warning lights associated with the vault, despite the vault being opened by the proper combination.
- the event and rules correlating process 140 - 2 would thus initiate the one or more actions defined by the situation object whenever the situation is occurring, and as the actions defined therein changed, the event and rules correlating process 140 - 2 would take appropriate steps to initiate those changes as well.
- the event and rules correlating process 140 - 2 initiates one or more actions defined in an evaluated rule by sending appropriate commands to control particular devices, and in some embodiments, to intelligently control video devices.
- the event and rules correlating process 140 - 2 first receives security event data via an event pipeline, security event data defining an occurrence of an event, wherein each type of source of event data has its own pipeline, step 701 .
- the event and rules correlating process 140 - 2 then creates event objects from the received security event data, step 702 , and gathers the event objects into a rules engine, step 703 , as described herein.
- the rules engine comprises a plurality of rules, wherein each rule defines one or more conditions to be met and one or more actions to be taken in response.
- the event and rules correlating process 140 - 2 evaluates a rule from the plurality of rules within the rules engine at a frequency over a period of time using data contained within event objects, step 704 , so as to determine whether the one or more conditions defined within the evaluated rule are met. Finally, the event and rules correlating process 140 - 2 initiates the one or more actions defined by the evaluated rule whenever the one or more conditions defined by that rule are met, step 705 . Actions may include any of the commands described herein.
- the event and rules correlating process 140 - 2 initiates the one or more actions defined by the evaluated rule whenever the one or more conditions defined by that rule are met, wherein one of the one or more actions includes intelligent controlling of a video device based on received security event data, step 706 .
- intelligent controlling of a video device may include, for example, allowing a user to use a series of video cameras to track an individual both forwards and backwards through an area under surveillance of the series of video cameras.
- intelligent controlling of a video device may include intelligently controlling a sequence of cameras in order to provide a video tour of an area.
- Such methods and apparatus are disclosed in co-pending U.S. patent application Ser. No. ______, entitled “INTELLIGENT VIDEO TOURS”, given Attorney Docket No. VID08-04, the entirety of which is hereby incorporated by reference.
- intelligently controlling a video device uses automated functionality to assist a user in deciding which video device(s) to use and how that (those) device(s) should be used (i.e., the direction in which the device should be aimed, etc.).
- the event and rules correlating process 140 - 2 provides a standardized way of controlling a video device, or any other controllable device, that may be programmed by a user. That is, in such embodiments, the event and rules correlating process 140 - 2 creates an output package to cause one or more types of a device to execute one or more steps, step 707 .
- the event and rules correlating process 140 - 2 creates an output package for a type of a device using information output by the rules engine and then processed in an output pipeline.
- An output pipeline is essentially the input pipeline discussed above with regards to FIGS. 4 and 2B , in reverse. Thus, an output pipeline is a textually defined stack used to control external devices, such as access control, lighting systems, etc.
- the event and rules correlating process 140 - 2 defines a stack that formulates a command message using various layers, the result being a command-specific message for an external device (i.e., an output package).
- An example of an output pipeline is shown in FIG. 2C .
- an output action 270 is passed to a stack 280 .
- the stack 280 may contain any number of layers, but in the example shown, contains a normalize layer 282 , a format layer 284 , and a transmit layer 286 , as well as an optional log layer 288 .
- the event and rules correlating process 140 - 2 through the stack 280 , thus reformulates the output action 270 as an output package.
- the event and rules correlating process 140 - 2 first determines, from the output action 270 , the device to which the output package is to be sent, and the one or more steps it is to execute (i.e., commands), from the one or more actions defined by the evaluated rule.
- This information, because it is generated by the event and rules correlating process 140 - 2 in the rules engine, is in the canonical form discussed above with regards to FIG. 4 .
- the device ultimately receiving the output package, of course, does not necessarily understand commands formatted in such a form.
- the event and rules correlating process 140 - 2 in the normalize layer 282 , removes the standardization of the canonical form, and then, in the format layer 284 , formats the information for the particular type of the device that is receiving the output package. That is, for example, the event and rules correlating process 140 - 2 will format access control system commands (i.e., lock a door/window, unlock a door/window, etc.) according to the particular format of that brand of access control system that is receiving the output package.
- the event and rules correlating process 140 - 2 will format lighting system commands (i.e., lights on, lights off, intensity of the light(s), etc.) according to the particular format of the brand of lighting system that is receiving the output package, such that each different type of lighting system that is to receive commands receives its own formatted output package.
- the event and rules correlating process 140 - 2 has created one or more output packages and now must send it/them to the appropriate device(s).
- the event and rules correlating process 140 - 2 thus transmits the created output package to the one or more types of the device, where upon receipt of its created output package, each of the one or more types of the device executes the one or more steps included in the output package, step 708 .
- the event and rules correlating process 140 - 2 does this through the transmit layer 286 of the output pipeline.
- the transmit layer 286 includes all the various transport protocols that may be used by the device(s) receiving the output packages, and uses the particular transport protocol for a particular device to send the output package for that device to that device.
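The output pipeline stages above can be sketched end to end: an output action in canonical form is normalized, formatted for the particular brand of device, and then handed off for transmission. The layer behaviors and the wire formats shown are invented; the real layers would use device- and protocol-specific logic.

```python
# Illustrative sketch of the output pipeline of FIG. 2C: an output
# action passes through normalize, format, and transmit layers, ending
# as a device-specific command message (the output package).

sent = []    # stand-in transcript of transmitted packages

def normalize(action: dict) -> dict:
    # remove the canonical-form standardization, keeping device + command
    return {"device": action["device"], "command": action["command"]}

def format_for_device(msg: dict) -> str:
    # per-brand formatting; one made-up wire format per device type
    formats = {"lock": "ACS:{command}", "light": "LIGHT/{command}"}
    return formats[msg["device"]].format(command=msg["command"])

def transmit(msg: dict, package: str):
    # the transmit layer would select the device's transport protocol here
    sent.append((msg["device"], package))

action = {"canonical": True, "device": "lock", "command": "LOCK_DOOR_7"}
msg = normalize(action)
transmit(msg, format_for_device(msg))
print(sent)    # [('lock', 'ACS:LOCK_DOOR_7')]
```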
- the event and rules correlating process 140 - 2 may use the log layer 288 to record some or all of the output packages being sent to device(s) for later analysis.
- the methods and systems described herein are not limited to a particular hardware or software configuration, and may find applicability in many computing or processing environments.
- the methods and systems may be implemented in hardware or software, or a combination of hardware and software.
- the methods and systems may be implemented in one or more computer programs, where a computer program may be understood to include one or more processor executable instructions.
- the computer program(s) may execute on one or more programmable processors, and may be stored on one or more storage medium readable by the processor (including volatile and non-volatile memory and/or storage elements), one or more input devices, and/or one or more output devices.
- the processor thus may access one or more input devices to obtain input data, and may access one or more output devices to communicate output data.
- the input and/or output devices may include one or more of the following: Random Access Memory (RAM), Redundant Array of Independent Disks (RAID), floppy drive, CD, DVD, magnetic disk, internal hard drive, external hard drive, memory stick, flash memory (i.e., solid state memory) device, or other storage device capable of being accessed by a processor as provided herein, where such aforementioned examples are not exhaustive, and are for illustration and not limitation.
- the computer program(s) may be implemented using one or more high level procedural or object-oriented programming languages to communicate with a computer system; however, the program(s) may be implemented in assembly or machine language, if desired.
- the language may be compiled or interpreted.
- the processor(s) may thus be embedded in one or more devices that may be operated independently or together in a networked environment, where the network may include, for example, a Local Area Network (LAN), wide area network (WAN), and/or may include an intranet and/or the internet and/or another network.
- the network(s) may be wired or wireless or a combination thereof and may use one or more communications protocols to facilitate communications between the different processors.
- the processors may be configured for distributed processing and may utilize, in some embodiments, a client-server model as needed. Accordingly, the methods and systems may utilize multiple processors and/or processor devices, and the processor instructions may be divided amongst such single- or multiple-processor/devices.
- the device(s) or computer systems that integrate with the processor(s) may include, for example, a personal computer(s), workstation(s) (e.g., Sun, HP), personal digital assistant(s) (PDA(s)), handheld device(s) such as cellular telephone(s), laptop(s), handheld computer(s), or another device(s) capable of being integrated with a processor(s) that may operate as provided herein. Accordingly, the devices provided herein are not exhaustive and are provided for illustration and not limitation.
- references to “a microprocessor” and “a processor”, or “the microprocessor” and “the processor,” may be understood to include one or more microprocessors that may communicate in a stand-alone and/or a distributed environment(s), and may thus be configured to communicate via wired or wireless communications with other processors, where such one or more processors may be configured to operate on one or more processor-controlled devices that may be similar or different devices.
- Use of such “microprocessor” or “processor” terminology may thus also be understood to include a central processing unit, an arithmetic logic unit, an application-specific integrated circuit (IC), and/or a task engine, with such examples provided for illustration and not limitation.
- references to memory may include one or more processor-readable and accessible memory elements and/or components that may be internal to the processor-controlled device, external to the processor-controlled device, and/or may be accessed via a wired or wireless network using a variety of communications protocols, and unless otherwise specified, may be arranged to include a combination of external and internal memory devices, where such memory may be contiguous and/or partitioned based on the application.
- references to a database may be understood to include one or more memory associations, where such references may include commercially available database products (e.g., SQL, Informix, Oracle) and also proprietary databases, and may also include other structures for associating memory such as links, queues, graphs, trees, with such structures provided for illustration and not limitation.
- references to a network may include one or more intranets and/or the internet.
- References herein to microprocessor instructions or microprocessor-executable instructions, in accordance with the above, may be understood to include programmable hardware.
Abstract
Correlating security event data according to one or more rules to initiate one or more actions is disclosed. Security event data is received, via an event pipeline, that defines an occurrence of an event. Each type of source of event data may have its own pipeline. Event objects are created from the received security event data, and are gathered into a rules engine. The rules engine includes a plurality of rules. Each rule defines one or more conditions to be met and one or more actions to be taken in response. A rule is evaluated at a frequency over a period of time using data contained within event objects, so as to determine whether the one or more conditions defined within the evaluated rule are met. The one or more actions defined by the evaluated rule are initiated whenever the one or more conditions defined by that rule are met.
Description
- This application claims the benefit of the filing date of the following earlier filed co-pending U.S. Provisional Patent Applications: Ser. No. 60/939,517, entitled “METHOD AND APPARATUS FOR EVENT CAPTURE, CROSS DEVICE EVENT CORRELATION, AND RESPONSIVE ACTIONS”, having Attorney Docket No. VID07-02p, filed on May 22, 2007; Ser. No. 60/939,503, entitled “METHOD AND APPARATUS FOR TRACKING PEOPLE AND OBJECTS USING MULTIPLE LIVE AND RECORDED SURVEILLANCE CAMERA VIDEO FEEDS”, having Attorney Docket VID07-01p, also filed on May 22, 2007; Ser. No. 60/939,521 entitled “METHOD AND APPARATUS FOR INTELLIGENT VIDEO TOURS”, having Attorney Docket VID07-03p, also filed on May 22, 2007; and Ser. No. 60/939,528, entitled “METHOD AND APPARATUS FOR OPTIMAL ROUTING OF AUDIO & VIDEO SIGNALS THROUGH HETEROGENEOUS NETWORKS”, having Attorney Docket VID07-04p, also filed on May 22, 2007. These all share co-inventorship with the present application. The entire teachings and contents of these Provisional Patent Applications are hereby incorporated by reference herein in their entirety.
- Securing an area from threats, both internal to a facility and external to it, has long been a goal of those who keep something of value, desirable to others, within that area. Early conventional security mechanisms involved placing guards at points of entry to the area to be secured. If the area was surrounded by walls or gates, locks of differing strengths may also have been deployed on its points of entry. With increasing sophistication in technology, guards were also deployed within areas (i.e., inside buildings) to patrol, and badge readers and other electronic entry devices were used to supplement locks.
- Guards, however, are expensive, and are also capable of error. Particularly for larger facilities, or facilities with many possible points of entry, it may not be possible to hire enough guards to “watch” everything going on. Thus, automated devices such as security cameras have also been added to the mix of security measures. The addition of security cameras meant that security personnel could “see” all of the interesting areas of a facility (i.e., points of entry, locations where things of value were stored, etc.). However, an increase in the number of cameras placed into a facility made it harder to watch all of the cameras simultaneously without hiring more personnel to watch them. Doing so would remove the primary monetary advantage of using cameras, that is, not having to employ more personnel.
- One conventional solution to the “too many cameras” problem is for security personnel to pay particular attention to a subset of all the available cameras. In this scenario, the question becomes which cameras to watch, and which cameras to ignore. Typically, the areas of high value and high risk (e.g., a vault, primary entry and exit points such as the doors going in/out of a building, etc.) are given primary focus, and other areas of lesser value and/or risk are given secondary focus. These measures have served as an adequate defense against “low tech” threats (i.e., breaking and entering by common criminals).
- Conventional security mechanisms such as those explained above suffer from a variety of deficiencies. One such deficiency is that, with an increase in the number of more active and more dangerous threats (i.e., terrorist activities, etc.), the cost of “watching” to see if something bad happens versus being more proactive has become very high. That is, conventional security systems allowed personnel to discover and deal with problems as they happened. Now it is of far more value to be able to have some amount of warning before a problem develops fully. Thus, instead of using video for watching high value/risk areas, and using video for forensics after the fact, embodiments disclosed herein provide for security mechanisms that provide situational awareness through rules-based processing of events.
- Situational awareness includes an ability to take into account a variety of events that occur in and around a protected area or facility, including simultaneous or near-simultaneous events. For example, situational awareness may include, in some situations, being aware that: multiple people have gone through a door on a single badge swipe (detected through a combination of access control mechanisms and video analytics); multiple doors are being tried (detected through door jiggle sensors); or perhaps something in a protected facility/area has gone out of an allowable range (i.e., level of fluid in a tank, acidity of a substance, etc.). By monitoring a number of events in and around a protected area/facility, and detecting unusual and/or unexpected events, embodiments bring such events to the attention of security personnel. Security personnel are then able to view areas of potential problems via traditional means (i.e., video cameras and network digital video recorders (NDVRs)), rather than waiting to be notified that a problem is in fact currently occurring.
- Situational awareness as provided by embodiments described herein may include a number of phases. In a first phase, responses to individual alerts, or “singletons”, occur. A singleton is a single event that causes one or more actions to take place, such as flipping cameras, sounding alarms, etc. Combining singletons results in more complicated scenarios that require a rules engine and correlation between events, as described herein. A second phase of situational awareness may be realizing that there is much more information to be mined by combining the results of many different types of events and correlating them. For example, it is not a particularly interesting event to know that someone has just logged into a computer located within a building, nor is it a particularly interesting event to know that someone has left the building. However, it is very interesting to know that the same person that just logged into the computer located within the building is also reported to not be in the building.
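The login/exit correlation above can be pictured as a small cross-device rule evaluated against recently gathered events. This is a minimal sketch, not the format of any particular system; the event fields and function names are assumptions for illustration:

```python
from dataclasses import dataclass

# Illustrative event records; the field names are assumptions for this sketch.
@dataclass
class Event:
    kind: str      # e.g. "badge_exit" or "computer_login"
    person: str    # identity resolved from the badge id or login name
    location: str

def login_while_absent(events):
    """Flag people who logged in to an in-building computer after badging out.

    Neither event is interesting on its own; the correlation across the two
    device types is what signals a potential problem.
    """
    exited = {e.person for e in events if e.kind == "badge_exit"}
    return [e.person for e in events
            if e.kind == "computer_login" and e.person in exited]

events = [
    Event("badge_exit", "alice", "lobby"),
    Event("computer_login", "alice", "office-3"),
    Event("computer_login", "bob", "office-5"),
]
print(login_while_absent(events))  # only alice triggers the rule
```

In practice such a rule would also bound the correlation to a time window, but the cross-source join shown here is the essential step.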
- By combining fairly simple unrelated events into a model of what is “normal” (i.e., typical or expected behavior) and what is not normal (i.e., atypical or unexpected behavior), embodiments focus security personnel on unusual events, instead of the security personnel waiting for an external event (i.e., a phone call) to occur to warn that a problem is in progress. Continuing the above example, it is far more advantageous to know that a potential problem exists, based on the known information, than to know after the fact that the company's computer system was hacked into via the CEO's computer by a person at a remote location.
- Embodiments of the invention thus include a rules engine used for correlating the events that come in from various status/alerting systems. The rules engine does not attempt to use any sort of Artificial Intelligence (AI) to learn what is normal, or to learn what is not normal. Rather, embodiments require that creators of rules, such as the programmers (system integrators) of the rules engine, determine what is not “normal”, and create rules to detect those “abnormal” situations. Note that defining everything that is “abnormal” or otherwise unusual is not required. Rather, serious thought about the normal course of events is required. Those thoughts may then be embodied as rules within the rules engine by programming them into the engine, and then looking for patterns of actual events that run counter to those rules. Using the example from above, such a rule notices the pattern of a person who is recorded as having left a building but then logs in to a computer that is supposed to be located within that building.
- Use of a rules engine, and rules-based processing of event data, is in contrast to conventional systems that typically employ event-based processing. That is, conventional systems include thousands (or more) of lines of code, in an attempt to cover every possible event scenario that may be encountered. Every scenario that is thought of must have code to deal with just that particular scenario. The more complicated the scenario is, the more complex the code for that scenario also is. Thus, because the entirety of the code must be evaluated every time an event occurs, event-based processing consumes a great deal of processing overhead. Further, conventional event-based processing systems may simply not include all possible scenarios. Rules-based processing through a rules engine, as described herein, not only accounts for and processes a variety of different scenarios, including increasingly complicated scenarios, more efficiently, but is also easier to formulate and less likely to leave scenarios out.
- Determining what is “abnormal” or “unusual” may be difficult for security and surveillance personnel. Programming rules that look for specific “bad” events is the first line of defense. However, even rules that look for too many unusual things over a specified period of time may also be very valuable. In the example of someone logging in while out of the building, it was immediately clear that this was a pattern of a potential issue. There is another class or series of events that may be neither good nor significantly bad when treated independently, and that may not be correlated by a single rule. Such a class or series of events may nonetheless be an indicator that something is a bit amiss. For example, a light out in a parking lot, a door camera reported broken, and a heat sensor running higher than expected, are individual events. None of these singleton events represents an imminent threat, but in combination, with perhaps many other trivial rules that represent exceptions to the “normal”, it is possible to weight and score these exceptions. Once these exceptions reach a predefined threshold, it may be deduced that something might be wrong. This sort of early warning, in some scenarios, might be significant enough to cause a medium scale warning, or it might result in something as simple as an alert message (page, SIP call, etc.) being sent to a central area from which the security personnel may respond or otherwise act.
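The weighting-and-scoring approach described above can be sketched as follows. The specific weights and threshold are invented for illustration; a deployment would tune them per site:

```python
# Each minor exception to "normal" carries a weight; none alone is an
# imminent threat, but their accumulated score can cross a warning threshold.
EXCEPTION_WEIGHTS = {            # illustrative values, not from the disclosure
    "parking_light_out": 1,
    "door_camera_broken": 2,
    "heat_sensor_high": 2,
}
WARNING_THRESHOLD = 4            # assumed predefined threshold

def score_exceptions(observed):
    """Sum the weights of the observed exceptions (the 'accumulator')."""
    return sum(EXCEPTION_WEIGHTS.get(name, 0) for name in observed)

def accumulator_rule(observed):
    """Return a medium-scale warning once the accumulated score crosses the threshold."""
    total = score_exceptions(observed)
    if total >= WARNING_THRESHOLD:
        return f"warning: exception score {total} >= {WARNING_THRESHOLD}"
    return None

print(accumulator_rule(["parking_light_out"]))                       # below threshold: None
print(accumulator_rule(["parking_light_out", "door_camera_broken",
                        "heat_sensor_high"]))                        # score 5: warning fires
```

The early-warning action triggered here could be anything from flipping a camera to sending a page or SIP call to a central area.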
- Embodiments of the invention thus use events coming into the system and the rules that correlate them to provide situational awareness. The actions that are taken as a result and the policies that dictate how this is all acted upon may be varied from customer to customer. Thus, embodiments provide a broad and flexible security system that may easily be modified to meet a particular customer's needs.
- More particularly, in an embodiment, there is provided a method of correlating security event data according to one or more rules to initiate one or more actions. The method includes receiving security event data via an event pipeline, security event data defining an occurrence of an event, wherein each type of source of event data may have its own pipeline, and creating event objects from the received security event data. The method also includes gathering the event objects into a rules engine, the rules engine including a plurality of rules, wherein each rule defines one or more conditions to be met and one or more actions to be taken in response. The method also includes evaluating a rule from the plurality of rules within the rules engine at a frequency over a period of time using data contained within event objects, so as to determine whether the one or more conditions defined within the evaluated rule are met, and initiating the one or more actions defined by the evaluated rule whenever the one or more conditions defined by that rule are met.
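The claimed steps can be pictured as a minimal evaluation pass over gathered event objects. The rule representation below is an illustrative assumption, not the engine's actual internals:

```python
# A rule pairs a condition over the collected event objects with the
# actions to initiate when that condition is met.
RULES = [
    {
        "name": "door_forced",
        "condition": lambda events: any(e["type"] == "door_forced" for e in events),
        "actions": ["flip_camera", "notify_guard"],
    },
]

def evaluate_rules(event_objects, rules=RULES):
    """One evaluation pass: initiate the actions of every rule whose
    conditions are met by the gathered event objects."""
    initiated = []
    for rule in rules:
        if rule["condition"](event_objects):
            initiated.extend(rule["actions"])
    return initiated

# In a deployment this pass would repeat at a fixed frequency over a period
# of time; a single pass is shown here.
events = [{"type": "badge_swipe"}, {"type": "door_forced"}]
print(evaluate_rules(events))  # ['flip_camera', 'notify_guard']
```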
- In a related embodiment, creating may include pre-processing the received security event data to create event objects that include the received security event data in a canonical form. In a further related embodiment, pre-processing may include supplementing an event object with related data not acquired from the source of the received security event data, by using an item of data within the received security event data to look up the related data, the related data used to set one or more attributes of the event object.
- In another related embodiment, the method may include, prior to evaluating a rule, injecting one or more facts into the rules engine, the one or more facts used to evaluate one or more rules. In yet another related embodiment, the method may include assigning weights to a group of rules within the rules engine, such that when the one or more conditions of a rule in the group of rules are met, a value of the weight assigned to that rule is added to an accumulator; including an accumulator rule in the rules engine, wherein a condition of the accumulator rule is defined as the values in the accumulator exceeding a given value, the accumulator rule defining one or more actions to occur upon the condition being met; evaluating the accumulator rule at a frequency over a period of time; and upon the condition of the accumulator rule being met, triggering the one or more actions defined in the accumulator rule to occur.
- In still yet another related embodiment, the method may include identifying, within the rules engine, a situation, wherein a situation is the occurrence of one or more events; creating, in a situation manager, a situation object, the situation object including data from one or more event objects that describe the one or more events, the included data identifying the one or more events that are occurring that make up the situation, the situation further including one or more actions to be taken upon occurrence of the situation; and injecting the situation object from the situation manager into the rules engine. In a further related embodiment, the rules engine may include one or more rules where a first condition to be met is that a situation is occurring, and where a second condition to be met is that one or more further events occur, and the method may include, upon occurrence of the one or more further events, changing the situation object corresponding to the situation by adding the one or more further events to the situation object. In another further related embodiment, the method may include, in response to adding the one or more further events to the situation object, further changing the situation object by changing the one or more actions to be taken upon occurrence of the situation, the change in the one or more actions based upon the occurrence of the one or more further events.
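One way to picture the situation-object lifecycle described above — created by a situation manager, injected back into the rules engine, and changed as further events arrive — is the following sketch. The class and field names are assumptions made for illustration:

```python
class Situation:
    """Aggregates the events that make up an ongoing situation, plus the
    actions to take; both may change as further events are added."""

    def __init__(self, initial_events, actions):
        self.events = list(initial_events)
        self.actions = list(actions)

    def add_event(self, event, extra_actions=()):
        # A rule whose first condition is "this situation is occurring" and
        # whose second condition is a further event would call this on match.
        self.events.append(event)
        self.actions.extend(extra_actions)

# A forced-door event opens the situation; a later motion event escalates it,
# changing the actions to be taken for the same situation object.
situation = Situation(["door_forced:loading_dock"], ["flip_camera_12"])
situation.add_event("motion:warehouse_aisle_3",
                    extra_actions=["sound_alarm", "page_guard"])
print(situation.actions)  # the action list grew as the situation changed
```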
- In yet another related embodiment, initiating may include initiating the one or more actions defined by the evaluated rule whenever the one or more conditions defined by that rule are met, wherein one of the one or more actions includes intelligent controlling of a video device based on received security event data. In still another related embodiment, initiating may include creating an output package to cause one or more types of a device to execute one or more steps, the device and the one or more steps it is to execute determined from the one or more actions defined by the evaluated rule, wherein the output package is standardized for each type of the device; and transmitting the created output package to the one or more types of the device, where upon receipt of its created output package, each of the one or more types of the device executes the one or more steps included in the output package.
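The standardized output package described above might be modeled as below. The envelope shape, device types, and step names are hypothetical; the point is that one rule firing can fan out into per-device-type packages that share a common format:

```python
def make_output_package(device_type, steps):
    """Build a standardized package telling one type of device which steps
    to execute; the same envelope shape is used for every device type."""
    return {"device_type": device_type, "steps": list(steps)}

def dispatch(actions):
    """Translate a rule's actions into one package per device type."""
    by_type = {}
    for device_type, step in actions:
        by_type.setdefault(device_type, []).append(step)
    return [make_output_package(t, s) for t, s in by_type.items()]

# A single rule firing may drive several device types at once.
packages = dispatch([("camera", "pan_to_preset_4"),
                     ("camera", "start_recording"),
                     ("door_lock", "engage")])
for p in packages:
    print(p)
```

Each package would then be transmitted to devices of its type, which execute the steps it contains.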
- In another embodiment, there is provided a computer program product, stored on a computer readable medium, for correlating security event data according to one or more rules to initiate one or more actions. The computer program product includes computer program code for receiving security event data via an event pipeline, security event data defining an occurrence of an event, wherein each type of source of event data may have its own pipeline, and computer program code for creating event objects from the received security event data. The computer program product also includes computer program code for gathering the event objects into a rules engine, the rules engine comprising a plurality of rules, wherein each rule defines one or more conditions to be met and one or more actions to be taken in response, computer program code for evaluating a rule from the plurality of rules within the rules engine at a frequency over a period of time using data contained within event objects, so as to determine whether the one or more conditions defined within the evaluated rule are met, and computer program code for initiating the one or more actions defined by the evaluated rule whenever the one or more conditions defined by that rule are met.
- In another embodiment, there is provided a computer system. The computer system includes a memory; a processor; a network interface; and an interconnection mechanism coupling the memory, the processor, and the network interface, allowing communication therebetween. The memory is encoded with an event and rules correlating application that, when executed in the processor, provides an event and rules correlating process that correlates security event data according to one or more rules to initiate one or more actions. The event and rules correlating process causes the computer system to perform the operations of: receiving security event data via an event pipeline, security event data defining an occurrence of an event, wherein each type of source of event data may have its own pipeline; creating event objects from the received security event data; gathering the event objects into a rules engine, the rules engine comprising a plurality of rules, wherein each rule defines one or more conditions to be met and one or more actions to be taken in response; evaluating a rule from the plurality of rules within the rules engine at a frequency over a period of time using data contained within event objects, so as to determine whether the one or more conditions defined within the evaluated rule are met; and initiating the one or more actions defined by the evaluated rule whenever the one or more conditions defined by that rule are met.
- Other arrangements of embodiments of the invention that are disclosed herein include software programs to perform the method embodiment steps and operations summarized above and disclosed in detail below. More particularly, a computer program product is one embodiment that has a computer-readable medium including computer program logic encoded thereon that when performed in a computerized device provides associated operations for correlating security event data according to one or more rules to initiate one or more actions, as explained herein. The computer program logic, when executed on at least one processor within a computing system, causes the processor to perform the operations (e.g., the methods) indicated herein as embodiments of the invention. Such arrangements of the invention are typically provided as software, code and/or other data structures arranged or encoded on a computer readable medium such as but not limited to an optical medium (e.g., CD-ROM, DVD-ROM, etc.), floppy or hard disk, a so-called “flash” (i.e., solid state) memory medium, or other physical medium, such as but not limited to firmware or microcode in one or more ROM or RAM or PROM chips, or as an Application Specific Integrated Circuit (ASIC), or as downloadable software images in one or more modules, shared libraries, etc. The software or firmware or other such configurations can be installed onto a computerized device to cause one or more processors in the computerized device to perform the techniques explained herein as embodiments of the invention. Software processes that operate in a collection of computerized devices, such as in a group of data communications devices or other entities, may also provide the system of the invention. The system of the invention may be distributed between many software processes on several data communications devices, or all processes may run on a small set of dedicated computers, or on one computer alone.
- It is to be understood that embodiments of the invention may be embodied strictly as a software program, as software and hardware, or as hardware and/or circuitry alone. The features disclosed and explained herein may be employed in computerized devices and software systems for such devices such as those manufactured by VidSys, Inc., of Vienna, Va.
- The foregoing will be apparent from the following description of particular embodiments disclosed herein, as illustrated in the accompanying drawings in which like reference characters refer to the same parts throughout the different views. The drawings are not necessarily to scale, emphasis instead being placed upon illustrating the principles disclosed herein.
- FIG. 1 shows a high-level block diagram of a computer system according to one embodiment disclosed herein.
- FIGS. 2A-2C illustrate examples of system structures upon which a procedure performed by the system of FIG. 1 may operate.
- FIGS. 3A-3E illustrate example code of various components of embodiments described by FIGS. 4-7.
- FIG. 4 illustrates a flowchart of a procedure performed by the system of FIG. 1 when correlating events and rules so as to initiate actions related to security procedures.
- FIG. 5 illustrates a flowchart of a procedure performed by the system of FIG. 1 when assigning weights to various events such that, as weights accumulate, representing a group of events occurring in a period of time, particular actions are taken.
- FIG. 6 illustrates a flowchart of a procedure performed by the system of FIG. 1 when addressing situations and providing a situation object back to the rules engine so that the object is able to be modified over time as the situation changes.
- FIG. 7 illustrates a flowchart of a procedure performed by the system of FIG. 1 when initiating actions through use of commands sent to devices.
- Generally, disclosed embodiments provide methods and apparatus for correlating events to rules and, in response, taking one or more actions. Data pertaining to events is continually received over time from a variety of sources, such as video cameras, badge readers, sensors, RFID tags, and so on. This data is collected and normalized into a standard format for processing. Data may be deemed unimportant and thus left out, or otherwise filtered out for any number of reasons. The data is processed into structures that may be supplemented with additional related data. The structures are then loaded into a rules engine that contains a plurality of rules defining different event scenarios (i.e., singleton events or combinations of events) that may occur, as well as one or more actions to be taken in response to the occurrence of the scenario(s) defined in the rule(s). The rules are continually evaluated over a period of time, and whenever the condition(s) for a rule is(are) met, the action(s) associated with that rule is(are) taken.
- More particularly,
FIG. 1 is a block diagram illustrating example architecture of a computer system 110 that executes, runs, interprets, operates or otherwise performs an event and rules correlation application 140-1 and an event and rules correlation process 140-2 suitable for use in explaining example configurations disclosed herein. The computer system 110 may be any type of computerized device such as a personal computer, workstation, portable computing device, console, laptop, network terminal or the like. As shown in this example, the computer system 110 includes an interconnection mechanism 111 such as a data bus or other circuitry that couples a memory system 112, a processor 113, an input/output interface 114, and a communications interface 115. An input device 116 (e.g., one or more user controlled devices such as a keyboard, mouse, touchpad, trackball, etc.) couples to the processor 113 through the I/O interface 114 and enables a user 108 such as a web page developer to provide input commands and generally control a graphical user interface 160 shown on a display 130. The communications interface 115 enables the computer system 110 to communicate with other devices (e.g., other computers) on a network (not shown in FIG. 1). - The
memory system 112 is any type of computer readable medium and in this example is encoded with an event and rules correlation application 140-1. The event and rules correlation application 140-1 may be embodied as software code such as data and/or logic instructions (e.g., code stored in the memory or on another computer readable medium such as a removable disk) that supports processing functionality according to different embodiments described herein. During operation of the computer system 110, the processor 113 accesses the memory system 112 via the interconnection mechanism 111 in order to launch, run, execute, interpret or otherwise perform the logic instructions of the event and rules correlation application 140-1. Execution of the event and rules correlation application 140-1 in this manner produces processing functionality in an event and rules correlation process 140-2. In other words, the event and rules correlation process 140-2 represents one or more portions or runtime instances of the event and rules correlation application 140-1 performing or executing within or upon the processor 113 in the computer system 110 at runtime. - It is noted that example configurations disclosed herein include the event and rules correlation application 140-1 itself including the event and rules correlation process 140-2 (i.e., in the form of un-executed or non-performing logic instructions and/or data). The event and rules correlation application 140-1 may be stored on a computer readable medium (such as a floppy disk), hard disk, electronic, magnetic, optical or other computer readable medium. The event and rules correlation application 140-1 may also be stored in a
memory system 112 such as in firmware, read only memory (ROM), or, as in this example, as executable code in, for example, Random Access Memory (RAM). In addition to these embodiments, it should also be noted that other embodiments herein include the execution of the event and rules correlation application 140-1 in the processor 113 as the event and rules correlation process 140-2. Those skilled in the art will understand that the computer system 110 may include other processes and/or software and hardware components, such as an operating system not shown in this example. - The
display 130 need not be coupled directly to computer system 110. For example, the event and rules correlation application 140-1 may be executed on a remotely accessible computerized device via the network interface 115. In this instance, the graphical user interface 160 may be displayed locally to a user of the remote computer and execution of the processing herein may be client-server based. In some embodiments, the graphical user interface 160 may be a customer interface through which a user, such as the user 108, is able to learn various information and take various actions. The amount of features, and control thereof, may depend on a user level, such that a basic user has access to only a certain amount of features, while an administrator may have access to all available features. Key features of the graphical user interface 160 may include the ability to locate and activate cameras easily from a searchable directory; the ability to locate and activate cameras easily from a map representation; the ability to control cameras and NDVRs to effect Pan-Tilt-Zoom (PTZ), Iris, Focus, Playback, etc.; the ability to list and get details on incoming events; the ability to locate people and use collaboration tools to interact with them; the ability to locate shared folders and reference and active files; the ability to control video walls; the ability to initiate video tours, which are described in detail in co-pending U.S. patent application Ser. No. ______, entitled “INTELLIGENT VIDEO TOURS” and given Attorney Docket VID08-04; and administrative features such as but not limited to create/modify/list/delete users, roles, tours, devices, etc. -
FIGS. 2A-2C illustrate examples of system structures upon which the event and rules correlating process 140-2 may operate. FIGS. 3A-3E include example code of various components of embodiments described below, and are referenced throughout the discussion of FIGS. 4-7. FIGS. 4-7 are flowcharts of various embodiments of the event and rules correlating process 140-2. The rectangular elements are herein denoted “processing blocks” and represent computer software instructions or groups of instructions. Alternatively, the processing blocks represent steps performed by functionally equivalent circuits such as a digital signal processor circuit or an application specific integrated circuit (ASIC). The flowcharts do not depict the syntax of any particular programming language. Rather, the flowcharts illustrate the functional information one of ordinary skill in the art requires to fabricate circuits or to generate computer software to perform the processing required in accordance with the present invention. It should be noted that many routine program elements, such as initialization of loops and variables and the use of temporary variables, are not shown. It will be appreciated by those of ordinary skill in the art that unless otherwise indicated herein, the particular sequence of steps described is illustrative only and may be varied without departing from the spirit of the invention. Thus, unless otherwise stated, the steps described below are unordered, meaning that, when possible, the steps may be performed in any convenient or desirable order. -
FIG. 4 illustrates an embodiment of the event and rules correlating application 140-1, executing as the event and rules correlating process 140-2, to correlate security event data according to one or more rules to initiate one or more actions. The event and rules correlating process 140-2 first receives security event data via an event pipeline, step 401. Security event data defines an occurrence of an event. An event is simply something that takes place and is, in some manner, recorded or otherwise noticed by a device to which the event and rules correlating process 140-2 has access. Thus, for example, when a person uses a security badge at a badge reader at the door to a particular room, to enter that room, the security event data may include the id number of the badge and the location of the badge reader. The event would be using the security badge. A further event may be the opening and closing of the door, if there is a sensor on the door capable of noticing that event. The event data for such a thing would be a reading from the sensor, first indicating the opening of the door, and then indicating the shutting of the door. - The number of devices that may serve as sources of event data for the event and rules correlating process 140-2 is large. The event and rules correlating process 140-2 is able to interface with any number of security-related devices (video cameras, sensors, lighting systems, badge readers, electronic locks, etc.). Those devices differ not only in function, but in the format of the data they provide and in how they are controlled. Thus, the event and rules correlating process 140-2 must be flexible enough to deal with hundreds, perhaps thousands, of different data formats. The event and rules correlating process 140-2 achieves this flexibility through use of an input pipeline. An input pipeline as used by the event and rules correlating process 140-2 to receive event data, such as the example input pipeline shown in
FIG. 2A, may be thought of as an externally defined customizable stack 210 of various layers. Each layer in the stack 210 is relatively simple and is configured to perform a single function or a small set of functions. The layers may include a transport layer 212, a parser layer 214, and a normalize layer 216. The layers may also include optional log layers 218 and 220, and an optional filtering layer 219. Supporting any specific type of source of event data (i.e., device such as a camera, badge reader, etc.) is simply a matter of constructing a stack that includes appropriate layers (i.e., layers that are able to communicate with the device and format the event data into a form that is suitable for further processing by the event and rules correlating process 140-2 in the rules engine, as is discussed below). Example code of an example input pipeline including various layers in its stack is shown in FIG. 3A. - Note that, in some embodiments, each type of source of event data has its own pipeline. That is, cameras may have their own pipeline (not necessarily for video but rather, for example, for analytic data), while sensors may have their own pipeline, badge readers may have their own pipeline, and so on. A type of source need not be simply the different devices that deliver event data to the event and rules correlating process 140-2, but may also include variations within those devices. Thus, a first badge reader manufactured by one company may have a different data format and control mechanism than a second badge reader manufactured by another company. The event and rules correlating process 140-2 thus has a pipeline for the first badge reader (and any/all other badge readers of the same type) and a separate pipeline for the second badge reader (and any/all other badge readers of the same type). Alternatively, in some embodiments, different types of sources of event data may share a pipeline as is appropriate.
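The layered stack described above — transport, then parser, then normalizer — can be sketched as a chain of small single-purpose callables. The specific layer behaviors and field names below are invented for illustration, standing in for one device type's pipeline:

```python
# Each layer does one small job; a stack is just the layers composed in order.
def transport_layer(raw_bytes):
    # Speak the device's protocol; hand the rest of the stack a byte string
    # decoded into text (an "unpackaged byte stream").
    return raw_bytes.decode("utf-8")

def parser_layer(text):
    # Format-specific conversion only: e.g. split "key=value&key=value"
    # (as from an HTTP GET query string) into an internal dict.
    return dict(pair.split("=", 1) for pair in text.split("&"))

def normalize_layer(fields):
    # Map device-specific field names onto a canonical event form.
    return {"badge_id": fields.get("id"), "reader": fields.get("loc")}

def run_stack(data, layers):
    """Push data through each layer of the pipeline stack in order."""
    for layer in layers:
        data = layer(data)
    return data

event = run_stack(b"id=4417&loc=door-7",
                  [transport_layer, parser_layer, normalize_layer])
print(event)  # {'badge_id': '4417', 'reader': 'door-7'}
```

Supporting a new device type then amounts to assembling a list of appropriate layers, with optional logging or filtering layers interposed.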
- In the
transport layer 212, the event and rules correlating process 140-2, through the pipeline on its external side, communicates with the source of the event data (i.e., a device) using the language and transport (i.e., protocol) that is defined by the source. Thus, examples of transports may include but are not limited to HTTP, File, SNMP, SMTP, SOAP, TCP, and the like. The transport layer 212 takes event data in and provides it as raw event data to the parser layer 214. The event data, after coming out of a transport layer, is essentially an unpackaged byte stream. - The
parser layer 214 may include rules for pulling fields from XML, for running regular expressions, or for busting apart ASCII strings into a particular form. The parser layer 214 takes the raw event data and its attributes, as delivered by the transport layer 212, and converts that raw event data into a structure that may be operated on by the remaining layers of the pipeline. Note that the parser layer 214 does not understand, know, or make judgments on the raw event data received by the event and rules correlating process 140-2. Rather, the parser layer 214 performs a format-specific conversion from the incoming byte stream (i.e., raw event data) to an implementation-specific format for the remainder of the pipeline. For example, if the incoming raw event data were an HTTP GET request with name/value arguments in the query string, the parser layer 214 might convert those to the internal format, perhaps as an XML document. Alternatively, if the internal format were a name/value string, the parser layer 214 may pass along the pairs provided, but add to them the transport-specific fields: original URL, arrival time, host, etc. - The event and rules correlating process 140-2 next creates event objects from the received security event data,
step 402. An event object is a data structure that may, and typically does, include one or more attributes. That is, an event object may be thought of as a fact, with other related information included as well. In general, the more facts the event and rules correlating process 140-2 has, the better. As the raw security event data received by the event and rules correlating process 140-2 may be in any one of a number of formats, for the event and rules correlating process 140-2 to easily use that data, the event and rules correlating process 140-2 must first place it into a standardized format. That is, the event and rules correlating process 140-2 may create an event object by pre-processing the received security event data to create event objects that include the received security event data in a canonical form, step 406. The pre-processing may include transforming, combining, and/or deleting fields or even events, among other actions. The event and rules correlating process 140-2 performs this pre-processing (and thus the creation of the event object) in at least the normalizer layer 216. - The fact inside the event object, along with attributes, essentially defines what the event was. For example, using the badge strike example discussed above, the event object created by the event and rules correlating process 140-2 for that event data would include the badge id. The event object may also include related attributes, such as information that identifies the location of the badge reader, the owner of the badge with that badge id, and so on. The related attributes do not come from the source of the event data, but rather must be acquired from another source or sources.
Thus, the event and rules correlating process 140-2 may supplement an event object with related data not acquired from the source of the received security event data, by using an item of data within the received security event data to look up the related data, the related data used to set one or more attributes of the event object, step 407. Thus, the event and rules correlating process 140-2 may use the badge id to perform a lookup in a database to determine who the owner of the badge id is, and may place this information in an attribute of the event object. In some embodiments, the event and rules correlating process 140-2 may use a first attribute to acquire other attributes. That is, the event and rules correlating process 140-2 may use the name of the owner of the badge with the received badge id to look up that person's permissions for entering various restricted areas of a protected facility. The event and rules correlating process 140-2 may also set those permissions as attributes within the event object, or in a separate event object.
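The attribute lookups described above (badge id to owner, then owner to permissions) might be sketched as follows, with in-memory dictionaries standing in for the external database; all names and values are hypothetical:

```python
# Sketch of supplementing an event object with related data. The lookup
# tables are hypothetical stand-ins for an external database or directory.
BADGE_OWNERS = {"4711": "A. Smith"}
PERMISSIONS = {"A. Smith": {"lab", "lobby"}}

def enrich_event(event):
    # First lookup: badge id -> owner. Second lookup: owner -> permissions.
    owner = BADGE_OWNERS.get(event["badge_id"])
    event["owner"] = owner
    event["permissions"] = PERMISSIONS.get(owner, set())
    return event

event = enrich_event({"badge_id": "4711", "reader": "door-12"})
```

Note how the second lookup is keyed on an attribute acquired by the first, mirroring the chained-attribute acquisition described in the text.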
- In some embodiments, the event and rules correlating process 140-2 may use the
optional filtering layer 219 to remove unnecessary data from an event object. For example, data from a sensor reporting its current value may arrive five times in a short duration of time (say, 10 seconds). The event and rules correlating process 140-2 may use the filtering layer 219 to determine that, because the value reported by the sensor is within an accepted range of error, the sensor has essentially reported the same value five times, and the event object need not contain that value five times over; once will suffice. Thus, the event and rules correlating process 140-2 may be programmed to store or otherwise note a current value of data for later comparisons. Alternatively, the event and rules correlating process 140-2 may use the filtering layer 219 to remove only certain pieces of data. For example, the same sensor may, over a ten minute period of time, occasionally report the same value, but viewed over the ten minutes, the value reported by the sensor shows a steady increase. The event and rules correlating process 140-2 may use the filtering layer 219 to remove only unnecessary repeats of a value, to thus make sure the event object contains the values showing that increase over time. In such a situation, the event and rules correlating process 140-2 is thus configured to maintain not only a current value, but also a number of recent values as well. - Note that an input pipeline as used by the event and rules correlating process 140-2 is generic, such that it may be extended to support more devices by adding support to the event and rules correlating process 140-2 for transports, parsers, and transforms for those devices. When sources of event data (i.e., devices) that the event and rules correlating process 140-2 communicates with via the pipeline have unusual characteristics, custom transports and/or custom parsers may have to be developed.
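The repeat-suppression behavior of the filtering layer 219 might be sketched as follows; the tolerance value and the layer interface are assumptions made for illustration only:

```python
# Sketch of a filtering layer that drops repeated sensor readings falling
# within an accepted error tolerance of the last kept value, while
# preserving readings that show genuine change over time.
class FilterLayer:
    def __init__(self, tolerance):
        self.tolerance = tolerance
        self.last_kept = None  # current value noted for later comparisons

    def __call__(self, value):
        # Return the value if it carries new information, else None (dropped).
        if self.last_kept is None or abs(value - self.last_kept) > self.tolerance:
            self.last_kept = value
            return value
        return None

f = FilterLayer(tolerance=0.5)
readings = [20.0, 20.1, 20.2, 21.0, 21.1]  # slow rise with noisy repeats
kept = [v for v in (f(r) for r in readings) if v is not None]
```

The repeats within tolerance are discarded, but the reading that reflects the rise survives, so the event object still shows the increase over time.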
Further note that event data in a pipeline may be logged at any time, by using either or both of the optional log layers 218 and 220.
- The event and rules correlating process 140-2 then gathers the event objects into a rules engine,
step 403. The rules engine comprises a plurality of rules. Each rule defines one or more conditions to be met and one or more actions to be taken in response. Example rules are shown in FIGS. 3B-3E. Rules are loaded from an external file and are thus not hard coded. Rules may be created by users, in some embodiments as XML files, or via a plug-in that supports a semi-natural language way to create syntax for proper rules, or via a graphical user interface. Some rules represent singleton events, that is, only one condition need occur. For example, a rule may be defined to send an alert message when a device including an RFID tag crosses a threshold point. Other rules may represent multiple events of varying complexity. Thus, such a rule may require that a device including an RFID tag crosses a threshold point and that, prior to this, a badge was used to open the door at the threshold point. Patterns may be important in some rules, while order may be important in others. The rules are able to define what the criteria are (i.e., the conditions) for when an action is to be initiated, and are thus easily able to cover a variety of constraints and scenarios. - The rules engine is not artificially intelligent, but rather is able to be programmed with rules to meet specific needs. Thus, it is possible to write rules to expect specific things or not expect specific things, but it is not possible to write a rule that says to ‘watch for something unusual’. To watch for something unusual is really to define everything that is usual and then declare what the unusual patterns are. Since the device input drivers convert data from different devices into a canonical form, rules may be written that look for conditions across different devices.
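The condition/action structure of a rule might be sketched as follows. The text describes rules loaded from external XML files; the inline callables below are merely an illustrative stand-in for that external definition, and the rule name and actions are hypothetical:

```python
# Sketch of rules as data: each rule pairs one or more conditions
# (predicates over event objects) with the actions to take when all
# conditions are met. Rule contents here are invented for illustration.
RULES = [
    {
        "name": "after-hours-restricted-entry",
        "conditions": [
            lambda e: e.get("location") == "restricted",
            lambda e: e.get("hour", 12) >= 22,   # injected time-of-day fact
        ],
        "actions": ["send_alert"],
    },
]

def evaluate(rule, event):
    """A rule fires only when every one of its conditions is met."""
    return all(cond(event) for cond in rule["conditions"])

fired = [r["actions"] for r in RULES
         if evaluate(r, {"location": "restricted", "hour": 23})]
```

Because every event object is already in canonical form, a single rule's conditions can freely mix facts originating from different kinds of devices.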
- The event and rules correlating process 140-2 evaluates a rule from the plurality of rules within the rules engine at a frequency over a period of time using data contained within event objects, so as to determine whether the one or more conditions defined within the evaluated rule are met, step 404. That is, the event and rules correlating process 140-2 causes the rules engine to look at the conditions defined by a rule and see if there are event objects, and in some cases other facts, containing data (i.e., describing events) that satisfy all of those conditions. Using the example rule shown in
FIG. 3B, the event and rules correlating process 140-2 looks for one or more event objects identifying location(s) to which access is restricted, and for one or more event objects identifying a person in such a restricted location even though that person does not have clearance to be in the restricted location. If the event and rules correlating process 140-2 finds such event objects (and the current time, determined as discussed below, is after hours), then the event and rules correlating process 140-2 will initiate the actions defined by the example rule, as discussed further below. - In some embodiments, prior to evaluating a rule, the event and rules correlating process 140-2 may inject one or more facts into the rules engine,
step 406. The one or more facts may be used to evaluate one or more rules. For example, the example rule shown in FIG. 3B requires knowledge of the current time of day. The event and rules correlating process 140-2 is able to acquire that time from a source (a database or a clock), and then inject it into the rules engine. When the event and rules correlating process 140-2 causes the rules engine to evaluate the rule shown in FIG. 3B, the rules engine is able to find out the time, and evaluate the rule appropriately. - There are cases where rules located in the rules engine are not sufficient for the event and rules correlating process 140-2 to detect what needs to be detected. As real world events happen, a user might want to go back and use the event and rules correlating process 140-2 to run a set of rules against captured device events. In other words, the user may wish to use the event and rules correlating process 140-2 to perform forensics analysis of event data. To provide such functionality, in some embodiments the event and rules correlating process 140-2 records and time stamps all event data and/or event objects, so they may be re-evaluated against a new set of rules at a later time. The event and rules correlating process 140-2 may use a data aggregator to perform these actions. The data aggregator buffers, out of band of the input pipeline, events that are targeted for the database. The data aggregator performs the buffering for a short period of time (e.g., seconds) and then writes the event data to the database as a collective blob. When in forensics mode, the event and rules correlating process 140-2 pulls these blobs from the database in order, unpacks the individual events within the blobs, and serially (i.e., in order) inserts those events into the input pipeline stages, to be re-processed and re-entered into the rules engine for the forensics analysis.
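The data aggregator's blob buffering and forensic replay might be sketched as follows, with an in-memory list standing in for the database; all names and the JSON blob encoding are assumptions for illustration:

```python
# Sketch of the data-aggregator idea: events are buffered out of band for
# a short window, written as one collective blob, and later unpacked in
# order for forensic replay. A plain list stands in for the database.
import json

class DataAggregator:
    def __init__(self, database):
        self.database = database
        self.buffer = []

    def add(self, event):
        self.buffer.append(event)

    def flush(self):
        # After the buffering window, write all buffered events as one blob.
        if self.buffer:
            self.database.append(json.dumps(self.buffer))
            self.buffer = []

def replay(database):
    # Forensics mode: pull blobs in order and unpack individual events,
    # yielding them serially for re-insertion into the input pipeline.
    for blob in database:
        for event in json.loads(blob):
            yield event

db = []
agg = DataAggregator(db)
agg.add({"t": 1, "event": "door-open"})
agg.add({"t": 2, "event": "door-close"})
agg.flush()
replayed = list(replay(db))
```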
These components, as well as the input pipeline, the rules engine, and other components of a system upon which the event and rules correlating process 140-2 may operate, are shown in
example system 200 of FIG. 2B. - Finally, the event and rules correlating process 140-2 initiates the one or more actions defined by the evaluated rule whenever the one or more conditions defined by that rule are met,
step 405. An action may be any command that an associated device is capable of carrying out. As the event and rules correlating process 140-2 may be associated with any number of security-related devices (video cameras, sensors of all types, microphones and other audio recording devices, NDVRs, lighting systems, badge readers, electronic locks, etc.) and is able to interface via the network interface 115 (shown in FIG. 1) to other network-enabled devices (computer systems, etc.) and networks themselves, the number of possible actions, as well as the depth of possible actions, is quite large. More particularly, actions may include, for example, sending messages using any number of transmission media (i.e., e-mail, phone call, facsimile, page, text message, video message, printout to a printing device, pop-up box or other alert mechanism displayed on a display device, etc.), and sending control commands to any associated devices, as is described in greater detail with regard to FIG. 7 below. - In
FIG. 5, the event and rules correlating process 140-2 addresses scenarios in which a number of small events, each not troubling on its own, are recognized as being part of a larger event, such that, when combined, the small events may represent a troubling problem. Thus, the event and rules correlating process 140-2 is capable of dealing with events that are not very interesting singly, or even in simple patterns, but are interesting if, for example, too many of them happen in too short a time period. - The event and rules correlating process 140-2 first receives security event data via an event pipeline, security event data defining an occurrence of an event, wherein each type of source of event data has its own pipeline,
step 501. Then the event and rules correlating process 140-2 creates event objects from the received security event data, step 502. The event and rules correlating process 140-2 gathers the event objects into a rules engine, the rules engine comprising a plurality of rules, wherein each rule defines one or more conditions to be met and one or more actions to be taken in response, step 503. - The event and rules correlating process 140-2 may then assign weights to a group of rules within the rules engine, step 504. Thus, when the one or more conditions of a rule in the group of rules are met, the value of the weight assigned to that rule is added to an accumulator. Any number of rules may be included in the group of rules. The group of rules represents a number of events that, when considered separately, are of little importance, but that, when they occur simultaneously or otherwise near each other, may be of much greater importance. For example, an event such as a door handle to a building being jiggled is, by itself, a scenario perhaps worthy of little attention. The hand of a person attempting to enter that building may have simply slipped off the handle, or the handle might be slightly stuck, causing a person to jiggle it to get the door to open. However, when, in a short period of time, handles on multiple doors, all at entrance points to a building, are jiggled, and then a number of windows on the ground floor of the building are jiggled as well, this may well represent a scenario of importance.
- The event and rules correlating process 140-2 may assign weights in any number of ways, and to any grouping of rules (and thus grouping of events). For example, the event and rules correlating process 140-2 may assign the jiggling of a door handle a lesser weight than the breaking of a window. Further, weights may be based on duration as well as time of day. That is, the event and rules correlating process 140-2 may assign weights such that the same door handle being jiggled five times in a week is assigned a lesser weight than the same door handle being jiggled five times in a minute. Further, the event and rules correlating process 140-2 may assign weights such that a door handle being jiggled at noon, while the building is full of people, has a lesser weight than a door handle being jiggled at midnight, after the building has been unoccupied for hours.
- Note that the weight the event and rules correlating process 140-2 assigns to an event does not have to remain fixed. In addition to a weight being affected by time and/or occurrence of an event in a duration of time, a weight may also be affected by how long the event remains in the rules engine. Thus, in some cases, as time passes since the occurrence of an event, the event and rules correlating process 140-2 may calculate and assign a weight that is related to the time since the event and rules correlating process 140-2 created the event object representing that event, and that is further decremented as more time passes. Alternatively, the event and rules correlating process 140-2 may increase a weight as the time since an event object was created increases. For example, if a temperature sensor on a tank containing a liquid shows a value above a safe level at 12:00 AM, and the temperature sensor is still showing a value above a safe level at 2:00 AM, the event and rules correlating process 140-2 may increase the weight assigned to that event because the event has not been dealt with.
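The time-varying weight described above might be sketched as follows; the linear decay and growth policies here are invented for illustration and are not prescribed by the embodiment:

```python
# Sketch of a weight that varies with event age (policy hypothetical).
# Ordinary events decay toward zero as they age out of the rules engine,
# while an unresolved "persistent" condition, such as a temperature still
# above a safe level, grows in weight the longer it goes unaddressed.
def current_weight(base_weight, age_minutes, persistent=False):
    if persistent:
        # Weight increases with the time since the event object was created.
        return base_weight * (1 + age_minutes / 60)
    # Weight is decremented as more time passes, never below zero.
    return max(0.0, base_weight * (1 - age_minutes / 60))

w_recent = current_weight(10, 30)                     # half-decayed
w_unresolved = current_weight(10, 120, persistent=True)  # escalated
```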
- The event and rules correlating process 140-2 must, in addition to assigning weights to existing rules, also include an accumulator rule in the rules engine, step 505. A condition of the accumulator rule is defined as the value in the accumulator associated with that rule exceeding a given value. The accumulator rule also defines one or more actions to occur upon the condition being met. The accumulator rule is what allows the event and rules correlating process 140-2 to monitor the value stored in an accumulator, and then take action if needed. The event and rules correlating process 140-2 may, of course, include a number of accumulator rules in the rules engine, all covering different rules from the same group of rules to which the event and rules correlating process 140-2 assigned weights. That is, for example, the group of rules to which the event and rules correlating process 140-2 assigned weights may include rules related to badge readers, rules related to video cameras, and rules related to infrared sensors. The event and rules correlating process 140-2 may include an accumulator rule that, for example, deals only with the badge reader rules, as well as an accumulator rule that deals only with the video camera rules, and an accumulator rule that deals with the badge reader rules, the video camera rules, and the infrared sensor rules. In other words, the variety of possible accumulator rules allows a user of the event and rules correlating process 140-2 to effectively define what combinations of events, and what combinations of weights for those combinations of events, result in one or more actions being taken.
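A weighted accumulator and an accumulator rule might be sketched as follows; the specific weights, threshold, and rule names are hypothetical:

```python
# Sketch of a weighted accumulator (values invented for illustration).
# Each firing rule in the group adds its assigned weight; the accumulator
# rule's condition is met once the total exceeds a given value.
RULE_WEIGHTS = {"door-handle-jiggled": 1, "window-jiggled": 2, "window-broken": 10}

class Accumulator:
    def __init__(self, threshold, actions):
        self.total = 0
        self.threshold = threshold
        self.actions = actions

    def rule_fired(self, rule_name):
        # When a rule in the group fires, the value of its weight is added.
        self.total += RULE_WEIGHTS[rule_name]

    def evaluate(self):
        # The accumulator rule: trigger actions if the total exceeds the value.
        return self.actions if self.total > self.threshold else []

acc = Accumulator(threshold=4, actions=["summon_security"])
for firing in ["door-handle-jiggled", "door-handle-jiggled",
               "window-jiggled", "window-jiggled"]:
    acc.rule_fired(firing)
triggered = acc.evaluate()
```

No single jiggle crosses the threshold, but the cluster of jiggles in a short period does, matching the scenario described above.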
- The event and rules correlating process 140-2 will then evaluate an accumulator rule at a frequency over a period of time,
step 506, similar to how the event and rules correlating process 140-2 evaluates any rule stored within the rules engine. Finally, upon the condition(s) of an accumulator rule being met, the event and rules correlating process 140-2 triggers the one or more actions defined in the accumulator rule to occur, step 507. - Note that, in some embodiments, an accumulator may have different levels associated with it. For example, an accumulator rule may indicate that, when the value in the accumulator passes a first given value, the events being accumulated may represent a low-level threat, and certain related actions result (i.e., providing an alert to security personnel). When the value in the accumulator passes a second given value, the events being accumulated may represent a mid-level threat, and certain other actions result (i.e., security personnel being summoned to a particular area). As many different types of levels as are desired may be defined, each with their own particular actions. Further, in some embodiments, the event and rules correlating process 140-2 may visualize each level on the graphical user interface 160 (shown in
FIG. 1), such as, for example, a dial or bar, or using a color-coded scheme. - Further, in some embodiments, a plurality of accumulators may be associated with particular geographic locations. That is, a first accumulator may be associated with a first building, a second accumulator associated with a second building, and so on. Alternatively, or additionally, a plurality of accumulators may be associated with a hierarchy of geographic locations. Thus, a first accumulator may be associated with a first floor of a first building, a second accumulator may be associated with a first floor of a second building, while a third accumulator may be associated with the first building and the second building. This allows the event and rules correlating process 140-2 to deal with slightly troubling events that are spread out over a large area. In other words, any single event may contribute to multiple accumulators, including both localized accumulator(s) as well as accumulator(s) associated with a broader area.
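The contribution of a single event to a hierarchy of location-based accumulators might be sketched as follows; the location layout and weights are hypothetical:

```python
# Sketch of hierarchical accumulators (layout invented for illustration).
# A single event contributes both to its localized accumulator and to
# every broader accumulator above it in the hierarchy.
ACCUMULATORS = {"floor-1": 0, "floor-2": 0, "building-1": 0}
HIERARCHY = {"floor-1": "building-1", "floor-2": "building-1", "building-1": None}

def contribute(location, weight):
    # Walk upward so local and broad accumulators both see the event.
    while location is not None:
        ACCUMULATORS[location] += weight
        location = HIERARCHY[location]

contribute("floor-1", 3)  # event on floor 1
contribute("floor-2", 2)  # separate event on floor 2
```

Neither floor alone may look alarming, but the building-wide accumulator sees the combined total, capturing events spread out over a large area.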
- In some embodiments, accumulators may count events themselves, instead of weights associated with events. Thus, an accumulator rule could count, for example, the number of people entering a room, and if that count exceeded a given value defined in the accumulator rule, an action would result.
-
FIG. 6 illustrates the event and rules correlating process 140-2 recognizing a number of events as a situation, and then injecting the situation back into the rules engine for further processing over time. The event and rules correlating process 140-2 first receives security event data via an event pipeline, security event data defining an occurrence of an event, wherein each type of source of event data has its own pipeline, step 601. The event and rules correlating process 140-2 then creates event objects from the received security event data, step 602, and gathers the event objects into a rules engine, step 603, as described herein. The rules engine comprises a plurality of rules, wherein each rule defines one or more conditions to be met and one or more actions to be taken in response. Next, the event and rules correlating process 140-2 evaluates a rule from the plurality of rules within the rules engine at a frequency over a period of time using data contained within event objects, step 604, so as to determine whether the one or more conditions defined within the evaluated rule are met. - Prior to any actions being taken, the event and rules correlating process 140-2 may identify, within the rules engine, a situation, wherein a situation is the occurrence of one or more events,
step 606. Thus, a situation is a number of events that occur simultaneously or within a span of time. Situations may include any number of events. Events comprising a situation may or may not be related. As discussed further below, a situation may change over time. Further, any number of situations may occur simultaneously, and multiple situations may include some of the same events. - In more detail, the event and rules correlating process 140-2 identifies a situation by detecting that a number of rules have evaluated positively and actions defined in those rules have occurred. That is, the event and rules correlating process 140-2 detects that the conditions for a number of rules, each rule corresponding to an event, the conditions corresponding to the occurrence of the event, have all been met, where the occurrence of those events constitutes the situation. For example, a situation may be a coordinated attempt to rob a bank. The events that make up the situation may include power being cut to video cameras, phone lines being cut, cellular phones being blocked, lights being turned off, incorrect combinations being given to a vault, and so on. The rules engine includes rules that describe all of these events through one or more conditions. When the event and rules correlating process 140-2 detects that all of these conditions for these rules have been met, such that actions defined in the rules are initiated, the event and rules correlating process 140-2 identifies a situation. The situation contains the events described by all the met conditions of the rules.
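The identification of a situation from a set of fired rules might be sketched as follows; the rule and situation names are hypothetical:

```python
# Sketch of situation identification (names invented for illustration).
# A situation is recognized once every rule in a defined set has fired,
# and it then contains the events described by those rules' conditions.
ROBBERY_RULES = {"cameras-cut", "phones-cut", "lights-off", "bad-vault-combo"}

def identify_situation(fired_rules):
    # All constituent rules must have evaluated positively.
    if ROBBERY_RULES <= set(fired_rules):
        return {"name": "bank-robbery", "events": sorted(ROBBERY_RULES)}
    return None

no_situation = identify_situation(["cameras-cut", "lights-off"])
situation = identify_situation(
    ["cameras-cut", "phones-cut", "lights-off", "bad-vault-combo"])
```

Only the full pattern of fired rules yields a situation; a partial match leaves the individual events to be handled by their own rules.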
- The event and rules correlating process 140-2 will then create, in a situation manager, a situation object,
step 607. The situation object includes data from one or more event objects that describe the one or more events. The included data identifies the one or more events that are occurring that make up the situation. Note that the situation object further includes one or more actions to be taken upon occurrence of the situation. - The event and rules correlating process 140-2 injects the situation object from the situation manager into the rules engine,
step 608. By placing the situation object in the rules engine, the event and rules correlating process 140-2 is able to maintain the state of all the event data that identifies the events that comprise the situation. That is, a situation tends to change over time. In the bank robbery example given above, the thieves may acquire the correct combination to a vault through coercion or other means, and thus they may be able to open that vault. The event and rules correlating process 140-2 is then able to update the situation object, as described below, so that the event data shows that the correct combination was provided and thus the vault is now opened, as opposed to unsuccessful attempts being made to provide the correct combination to the vault. - In more detail, the rules engine will include one or more rules where a first condition to be met is that a situation is occurring, and where a second condition to be met is that one or more further events occur. In such a scenario, for example the bank vault opening scenario described above, upon occurrence of the one or more further events (i.e., the proper combination being provided to the bank vault so that it opens), the event and rules correlating process 140-2 changes the situation object corresponding to the situation by adding the one or more further events to the situation object,
step 609. In some embodiments, events may be deleted from a situation object, or otherwise modified (e.g., updated), instead of, or in addition to, being added. - In addition to changing the situation object by adding one or more further events, the event and rules correlating process 140-2 may also, in response to doing so, further change the situation object by changing the one or more actions to be taken upon occurrence of the situation,
step 610. The change in the one or more actions may be based upon the occurrence of the one or more further events. Thus, continuing to use the bank robbery example, the event and rules correlating process 140-2 may, upon the correct combination being provided to open the vault, change the situation object so that one of the actions taken in response to the situation is to activate alarms and warning lights associated with the vault, despite the vault being opened by the proper combination. The event and rules correlating process 140-2 would thus initiate the one or more actions defined by the situation object whenever the situation is occurring, and as the actions defined therein changed, the event and rules correlating process 140-2 would take appropriate steps to initiate those changes as well. - In
FIG. 7, the event and rules correlating process 140-2 initiates one or more actions defined in an evaluated rule by sending appropriate commands to control particular devices, and in some embodiments, to intelligently control video devices. The event and rules correlating process 140-2 first receives security event data via an event pipeline, security event data defining an occurrence of an event, wherein each type of source of event data has its own pipeline, step 701. The event and rules correlating process 140-2 then creates event objects from the received security event data, step 702, and gathers the event objects into a rules engine, step 703, as described herein. The rules engine comprises a plurality of rules, wherein each rule defines one or more conditions to be met and one or more actions to be taken in response. Next, the event and rules correlating process 140-2 evaluates a rule from the plurality of rules within the rules engine at a frequency over a period of time using data contained within event objects, step 704, so as to determine whether the one or more conditions defined within the evaluated rule are met. Finally, the event and rules correlating process 140-2 initiates the one or more actions defined by the evaluated rule whenever the one or more conditions defined by that rule are met, step 705. Actions may include any of the commands described herein. - In some embodiments, more particularly, the event and rules correlating process 140-2 initiates the one or more actions defined by the evaluated rule whenever the one or more conditions defined by that rule are met, wherein one of the one or more actions includes intelligent controlling of a video device based on received security event data, step 706.
In some embodiments, intelligent controlling of a video device may include, for example, allowing a user to use a series of video cameras to track an individual both forwards and backwards through an area under surveillance of the series of video cameras. Such methods and apparatus are disclosed in co-pending U.S. patent application Ser. No. ______, entitled “TRACKING PEOPLE AND OBJECTS USING MULTIPLE LIVE AND RECORDED SURVEILLANCE CAMERA VIDEO FEEDS”, given Attorney Docket No. VID08-03, the entirety of which is hereby incorporated by reference. Alternatively, in other embodiments, intelligent controlling of a video device may include intelligently controlling a sequence of cameras in order to provide a video tour of an area. Such methods and apparatus are disclosed in co-pending U.S. patent application Ser. No. ______, entitled “INTELLIGENT VIDEO TOURS”, given Attorney Docket No. VID08-04, the entirety of which is hereby incorporated by reference. Thus, intelligently controlling a video device uses automated functionality to assist a user in deciding which video device(s) to use and how that (those) device(s) should be used (i.e., the direction in which the device should be aimed, etc.).
- In some embodiments, the event and rules correlating process 140-2 provides a standardized way of controlling a video device, or any other controllable device, that may be programmed by a user. That is, in such embodiments, the event and rules correlating process 140-2 creates an output package to cause one or more types of a device to execute one or more steps, step 707. The event and rules correlating process 140-2 creates an output package for a type of a device using information output by the rules engine and then processed in an output pipeline. An output pipeline is essentially an input pipeline, discussed above with regard to
FIGS. 4 and 2B, in reverse. Thus, an output pipeline is a textually defined stack used to control external devices, such as access control, lighting systems, etc. Similar to an input pipeline, the event and rules correlating process 140-2 defines a stack that formulates a command message using various layers, the result being a command-specific message for an external device (i.e., an output package). An example of an output pipeline is shown in FIG. 2C. - As is seen in
FIG. 2C, an output action 270, generated by the rules engine (shown in FIG. 2B), is passed to a stack 280. The stack 280 may contain any number of layers, but in the example shown, contains a normalize layer 282, a format layer 284, and a transmit layer 286, as well as an optional log layer 288. The event and rules correlating process 140-2, through the stack 280, thus reformulates the output action 270 as an output package. - The event and rules correlating process 140-2 first determines, from the
output action 270, the device to which the output package is to be sent, and the one or more steps it is to execute (i.e., commands), from the one or more actions defined by the evaluated rule. This information, because it is generated by the event and rules correlating process 140-2 in the rules engine, is in the canonical form discussed above with regards to FIG. 4. The device ultimately receiving the output package, of course, does not necessarily understand commands formatted in such a form. Thus, the event and rules correlating process 140-2, in the normalize layer 282, removes the standardization of the canonical form, and then, in the format layer 284, formats the information for the particular type of the device that is receiving the output package. That is, for example, the event and rules correlating process 140-2 will format access control system commands (i.e., lock a door/window, unlock a door/window, etc.) according to the particular format of the brand of access control system that is receiving the output package. In other words, all access control systems manufactured by, for example, Honeywell® will receive output packages containing commands formatted according to Honeywell® specifications, while all access control systems manufactured by Lenel® will receive output packages containing commands formatted according to Lenel® specifications. Alternatively, the event and rules correlating process 140-2 will format lighting system commands (i.e., lights on, lights off, intensity of the light(s), etc.) according to the particular format of the brand of lighting system that is receiving the output package, such that each different type of lighting system that is to receive commands receives its own formatted output package. At the completion of this layer, the event and rules correlating process 140-2 has created one or more output packages and now must send it/them to the appropriate device(s).
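The normalize and format layers described above can be illustrated with a minimal sketch. All names here (the canonical action dictionary, the brand formatters, and the command strings) are hypothetical illustrations of the layering concept, not the patent's actual implementation or any vendor's real protocol.

```python
"""Sketch of the normalize and format layers of an output pipeline:
a canonical (vendor-neutral) action is reduced to its raw command and
then rendered in the receiving brand's own format."""

# Canonical command produced by the rules engine (hypothetical fields).
CANONICAL_ACTION = {
    "device_type": "access_control",
    "brand": "Honeywell",
    "command": "LOCK",
    "target": "door-17",
}

# Format layer: each brand receives commands in its own wire format
# (formats shown are invented for illustration).
BRAND_FORMATS = {
    "Honeywell": lambda cmd, tgt: f"HW:{cmd}:{tgt}",
    "Lenel": lambda cmd, tgt: f"<lenel op='{cmd.lower()}' id='{tgt}'/>",
}

def normalize(action):
    """Normalize layer: strip the canonical wrapper down to the raw
    command, target, and brand the formatter needs."""
    return action["command"], action["target"], action["brand"]

def format_package(action):
    """Format layer: render the command for the receiving brand,
    yielding the output package."""
    cmd, tgt, brand = normalize(action)
    return BRAND_FORMATS[brand](cmd, tgt)

package = format_package(CANONICAL_ACTION)
print(package)  # HW:LOCK:door-17
```

The same canonical action sent to a Lenel® device would be formatted by the Lenel entry instead, so each device type receives only packages it can parse.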
- The event and rules correlating process 140-2 thus transmits the created output package to the one or more types of the device, where upon receipt of its created output package, each of the one or more types of the device executes the one or more steps included in the output package,
step 708. The event and rules correlating process 140-2 does this through the transmit layer 286 of the output pipeline. The transmit layer 286 includes all the various transport protocols that may be used by the device(s) receiving the output packages, and uses the appropriate transport protocol for each particular device to send that device's output package to it. Optionally, in some embodiments, the event and rules correlating process 140-2 may use the log layer 288 to record some or all of the output packages being sent to device(s) for later analysis. - The methods and systems described herein are not limited to a particular hardware or software configuration, and may find applicability in many computing or processing environments. The methods and systems may be implemented in hardware or software, or a combination of hardware and software. The methods and systems may be implemented in one or more computer programs, where a computer program may be understood to include one or more processor-executable instructions. The computer program(s) may execute on one or more programmable processors, and may be stored on one or more storage media readable by the processor (including volatile and non-volatile memory and/or storage elements), one or more input devices, and/or one or more output devices. The processor thus may access one or more input devices to obtain input data, and may access one or more output devices to communicate output data. The input and/or output devices may include one or more of the following: Random Access Memory (RAM), Redundant Array of Independent Disks (RAID), floppy drive, CD, DVD, magnetic disk, internal hard drive, external hard drive, memory stick, flash memory (i.e., solid state memory) device, or other storage device capable of being accessed by a processor as provided herein, where such aforementioned examples are not exhaustive, and are for illustration and not limitation.
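The transmit and optional log layers discussed above can likewise be sketched. The transport registry, endpoint values, and send functions below are illustrative assumptions; a real deployment would use actual device protocols rather than these stand-ins.

```python
"""Sketch of the transmit layer (per-device transport protocol
selection) and the optional log layer (recording sent packages)."""

sent_log = []  # log layer: output packages recorded for later analysis

def send_tcp(host, package):
    # A real system would open a socket; here we just describe the send.
    return f"tcp://{host} <- {package}"

def send_serial(port, package):
    return f"serial:{port} <- {package}"

# Transmit layer: each device is registered with the transport
# protocol it understands (entries are hypothetical).
DEVICE_TRANSPORTS = {
    "door-17": (send_tcp, "10.0.0.5"),
    "light-3": (send_serial, "/dev/ttyS0"),
}

def transmit(device_id, package, log=True):
    """Look up the device's transport protocol, send its output
    package, and optionally record the package in the log layer."""
    protocol, endpoint = DEVICE_TRANSPORTS[device_id]
    result = protocol(endpoint, package)
    if log:
        sent_log.append((device_id, package))
    return result

print(transmit("door-17", "HW:LOCK:door-17"))
```

Because each device carries its own transport entry, one rule firing can fan out to an access control panel over TCP and a lighting controller over a serial line without the rules engine knowing either protocol.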
- The computer program(s) may be implemented using one or more high level procedural or object-oriented programming languages to communicate with a computer system; however, the program(s) may be implemented in assembly or machine language, if desired. The language may be compiled or interpreted.
- As provided herein, the processor(s) may thus be embedded in one or more devices that may be operated independently or together in a networked environment, where the network may include, for example, a Local Area Network (LAN), wide area network (WAN), and/or may include an intranet and/or the internet and/or another network. The network(s) may be wired or wireless or a combination thereof and may use one or more communications protocols to facilitate communications between the different processors. The processors may be configured for distributed processing and may utilize, in some embodiments, a client-server model as needed. Accordingly, the methods and systems may utilize multiple processors and/or processor devices, and the processor instructions may be divided amongst such single- or multiple-processor/devices.
- The device(s) or computer systems that integrate with the processor(s) may include, for example, a personal computer(s), workstation(s) (e.g., Sun, HP), personal digital assistant(s) (PDA(s)), handheld device(s) such as cellular telephone(s), laptop(s), handheld computer(s), or another device(s) capable of being integrated with a processor(s) that may operate as provided herein. Accordingly, the devices provided herein are not exhaustive and are provided for illustration and not limitation.
- References to “a microprocessor” and “a processor”, or “the microprocessor” and “the processor,” may be understood to include one or more microprocessors that may communicate in a stand-alone and/or a distributed environment(s), and may thus be configured to communicate via wired or wireless communications with other processors, where such one or more processor may be configured to operate on one or more processor-controlled devices that may be similar or different devices. Use of such “microprocessor” or “processor” terminology may thus also be understood to include a central processing unit, an arithmetic logic unit, an application-specific integrated circuit (IC), and/or a task engine, with such examples provided for illustration and not limitation.
- Furthermore, references to memory, unless otherwise specified, may include one or more processor-readable and accessible memory elements and/or components that may be internal to the processor-controlled device, external to the processor-controlled device, and/or may be accessed via a wired or wireless network using a variety of communications protocols, and unless otherwise specified, may be arranged to include a combination of external and internal memory devices, where such memory may be contiguous and/or partitioned based on the application. Accordingly, references to a database may be understood to include one or more memory associations, where such references may include commercially available database products (e.g., SQL, Informix, Oracle) and also proprietary databases, and may also include other structures for associating memory such as links, queues, graphs, trees, with such structures provided for illustration and not limitation.
- References to a network, unless provided otherwise, may include one or more intranets and/or the internet. References herein to microprocessor instructions or microprocessor-executable instructions, in accordance with the above, may be understood to include programmable hardware.
- Unless otherwise stated, use of the word “substantially” may be construed to include a precise relationship, condition, arrangement, orientation, and/or other characteristic, and deviations thereof as understood by one of ordinary skill in the art, to the extent that such deviations do not materially affect the disclosed methods and systems.
- Throughout the entirety of the present disclosure, use of the articles “a” or “an” to modify a noun may be understood to be used for convenience and to include one, or more than one of the modified noun, unless otherwise specifically stated.
- Elements, components, modules, and/or parts thereof that are described and/or otherwise portrayed through the figures to communicate with, be associated with, and/or be based on, something else, may be understood to so communicate, be associated with, and or be based on in a direct and/or indirect manner, unless otherwise stipulated herein.
- Although the methods and systems have been described relative to a specific embodiment thereof, they are not so limited. Obviously many modifications and variations may become apparent in light of the above teachings. Those skilled in the art may make many additional changes in the details, materials, and arrangement of parts, herein described and illustrated.
Claims (22)
1. A method of correlating security event data according to one or more rules to initiate one or more actions, the method comprising:
receiving security event data via an event pipeline, security event data defining an occurrence of an event, wherein each type of source of event data may have its own pipeline;
creating event objects from the received security event data;
gathering the event objects into a rules engine, the rules engine comprising a plurality of rules, wherein each rule defines one or more conditions to be met and one or more actions to be taken in response;
evaluating a rule from the plurality of rules within the rules engine at a frequency over a period of time using data contained within event objects, so as to determine whether the one or more conditions defined within the evaluated rule are met; and
initiating the one or more actions defined by the evaluated rule whenever the one or more conditions defined by that rule are met.
2. The method of claim 1 wherein creating comprises:
pre-processing the received security event data to create event objects that include the received security event data in a canonical form.
3. The method of claim 2 wherein pre-processing comprises:
supplementing an event object with related data not acquired from the source of the received security event data, by using an item of data within the received security event data to look up the related data, the related data used to set one or more attributes of the event object.
4. The method of claim 1 comprising:
prior to evaluating a rule, injecting one or more facts into the rules engine, the one or more facts used to evaluate one or more rules.
5. The method of claim 1 comprising:
assigning weights to a group of rules within the rules engine, such that when the one or more conditions of a rule in the group of rules are met, a value of the weight assigned to that rule is added to an accumulator;
including an accumulator rule in the rules engine, wherein a condition of the accumulator rule is defined as the values in the accumulator exceeding a given value, the accumulator rule defining one or more actions to occur upon the condition being met;
evaluating the accumulator rule at a frequency over a period of time; and
upon the condition of the accumulator rule being met, triggering the one or more actions defined in the accumulator rule to occur.
6. The method of claim 1 comprising:
identifying, within the rules engine, a situation, wherein a situation is the occurrence of one or more events;
creating, in a situation manager, a situation object, the situation object including data from one or more event objects that describe the one or more events, the included data identifying the one or more events that are occurring that make up the situation, the situation further including one or more actions to be taken upon occurrence of the situation; and
injecting the situation object from the situation manager into the rules engine.
7. The method of claim 6 wherein the rules engine includes one or more rules where a first condition to be met is that a situation is occurring, and where a second condition to be met is that one or more further events occur, the method comprising:
upon occurrence of the one or more further events, changing the situation object corresponding to the situation by adding the one or more further events to the situation object.
8. The method of claim 7 comprising:
in response to adding the one or more further events to the situation object, further changing the situation object by changing the one or more actions to be taken upon occurrence of the situation, the change in the one or more actions based upon the occurrence of the one or more further events.
9. The method of claim 1 wherein initiating comprises:
initiating the one or more actions defined by the evaluated rule whenever the one or more conditions defined by that rule are met, wherein one of the one or more actions includes intelligent controlling of a video device based on received security event data.
10. The method of claim 1 wherein initiating comprises:
creating an output package to cause one or more types of a device to execute one or more steps, the device and the one or more steps it is to execute determined from the one or more actions defined by the evaluated rule, wherein the output package is standardized for each type of the device; and
transmitting the created output package to the one or more types of the device, where upon receipt of its created output package, each of the one or more types of the device executes the one or more steps included in the output package.
11. A computer program product, stored on a computer readable medium, for correlating security event data according to one or more rules to initiate one or more actions, comprising:
computer program code for receiving security event data via an event pipeline, security event data defining an occurrence of an event, wherein each type of source of event data may have its own pipeline;
computer program code for creating event objects from the received security event data;
computer program code for gathering the event objects into a rules engine, the rules engine comprising a plurality of rules, wherein each rule defines one or more conditions to be met and one or more actions to be taken in response;
computer program code for evaluating a rule from the plurality of rules within the rules engine at a frequency over a period of time using data contained within event objects, so as to determine whether the one or more conditions defined within the evaluated rule are met; and
computer program code for initiating the one or more actions defined by the evaluated rule whenever the one or more conditions defined by that rule are met.
12. The computer program product of claim 11 wherein computer program code for creating comprises:
computer program code for pre-processing the received security event data to create event objects that include the received security event data in a canonical form, wherein computer program code for pre-processing comprises computer program code for supplementing an event object with related data not acquired from the source of the received security event data, by using an item of data within the received security event data to look up the related data, the related data used to set one or more attributes of the event object.
13. The computer program product of claim 11 comprising:
computer program code for assigning weights to a group of rules within the rules engine, such that when the one or more conditions of a rule in the group of rules are met, a value of the weight assigned to that rule is added to an accumulator;
computer program code for including an accumulator rule in the rules engine, wherein a condition of the accumulator rule is defined as the values in the accumulator exceeding a given value, the accumulator rule defining one or more actions to occur upon the condition being met;
computer program code for evaluating the accumulator rule at a frequency over a period of time; and
upon the condition of the accumulator rule being met, computer program code for triggering the one or more actions defined in the accumulator rule to occur.
14. The computer program product of claim 11 comprising:
computer program code for identifying, within the rules engine, a situation, wherein a situation is the occurrence of one or more events;
computer program code for creating, in a situation manager, a situation object, the situation object including data from one or more event objects that describe the one or more events, the included data identifying the one or more events that are occurring that make up the situation, the situation further including one or more actions to be taken upon occurrence of the situation; and
computer program code for injecting the situation object from the situation manager into the rules engine.
15. The computer program product of claim 14 wherein the rules engine includes one or more rules where a first condition to be met is that a situation is occurring, and where a second condition to be met is that one or more further events occur, the computer program product comprising:
upon occurrence of the one or more further events, computer program code for changing the situation object corresponding to the situation by adding the one or more further events to the situation object.
16. The computer program product of claim 15 comprising:
in response to adding the one or more further events to the situation object, computer program code for further changing the situation object by changing the one or more actions to be taken upon occurrence of the situation, the change in the one or more actions based upon the occurrence of the one or more further events.
17. A computer system comprising:
a memory;
a processor;
a network interface; and
an interconnection mechanism coupling the memory, the processor, and the network interface, allowing communication therebetween;
wherein the memory of the computer system is encoded with an event and rules correlating application, that when executed in the processor, provides an event and rules correlating process that correlates security event data according to one or more rules to initiate one or more actions, by causing the computer system to perform operations of:
receiving security event data via an event pipeline, security event data defining an occurrence of an event, wherein each type of source of event data may have its own pipeline;
creating event objects from the received security event data;
gathering the event objects into a rules engine, the rules engine comprising a plurality of rules, wherein each rule defines one or more conditions to be met and one or more actions to be taken in response;
evaluating a rule from the plurality of rules within the rules engine at a frequency over a period of time using data contained within event objects, so as to determine whether the one or more conditions defined within the evaluated rule are met; and
initiating the one or more actions defined by the evaluated rule whenever the one or more conditions defined by that rule are met.
18. The computer system of claim 17 wherein creating comprises:
pre-processing the received security event data to create event objects that include the received security event data in a canonical form, wherein pre-processing comprises supplementing an event object with related data not acquired from the source of the received security event data, by using an item of data within the received security event data to look up the related data, the related data used to set one or more attributes of the event object.
19. The computer system of claim 17 comprising:
assigning weights to a group of rules within the rules engine, such that when the one or more conditions of a rule in the group of rules are met, a value of the weight assigned to that rule is added to an accumulator;
including an accumulator rule in the rules engine, wherein a condition of the accumulator rule is defined as the values in the accumulator exceeding a given value, the accumulator rule defining one or more actions to occur upon the condition being met;
evaluating the accumulator rule at a frequency over a period of time; and
upon the condition of the accumulator rule being met, triggering the one or more actions defined in the accumulator rule to occur.
20. The computer system of claim 17 comprising:
identifying, within the rules engine, a situation, wherein a situation is the occurrence of one or more events;
creating, in a situation manager, a situation object, the situation object including data from one or more event objects that describe the one or more events, the included data identifying the one or more events that are occurring that make up the situation, the situation further including one or more actions to be taken upon occurrence of the situation; and
injecting the situation object from the situation manager into the rules engine.
21. The computer system of claim 20 wherein the rules engine includes one or more rules where a first condition to be met is that a situation is occurring, and where a second condition to be met is that one or more further events occur, wherein the computer system performs operations of:
upon occurrence of the one or more further events, changing the situation object corresponding to the situation by adding the one or more further events to the situation object.
22. The computer system of claim 21 comprising:
in response to adding the one or more further events to the situation object, further changing the situation object by changing the one or more actions to be taken upon occurrence of the situation, the change in the one or more actions based upon the occurrence of the one or more further events.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/125,115 US20080294588A1 (en) | 2007-05-22 | 2008-05-22 | Event capture, cross device event correlation, and responsive actions |
Applications Claiming Priority (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US93952107P | 2007-05-22 | 2007-05-22 | |
US93951707P | 2007-05-22 | 2007-05-22 | |
US93952807P | 2007-05-22 | 2007-05-22 | |
US93950307P | 2007-05-22 | 2007-05-22 | |
US12/125,115 US20080294588A1 (en) | 2007-05-22 | 2008-05-22 | Event capture, cross device event correlation, and responsive actions |
Publications (1)
Publication Number | Publication Date |
---|---|
US20080294588A1 true US20080294588A1 (en) | 2008-11-27 |
Family
ID=40072426
Family Applications (4)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/125,122 Active 2031-05-12 US8350908B2 (en) | 2007-05-22 | 2008-05-22 | Tracking people and objects using multiple live and recorded surveillance camera video feeds |
US12/125,134 Active 2031-11-19 US8527655B2 (en) | 2007-05-22 | 2008-05-22 | Optimal routing of audio, video, and control data through heterogeneous networks |
US12/125,115 Abandoned US20080294588A1 (en) | 2007-05-22 | 2008-05-22 | Event capture, cross device event correlation, and responsive actions |
US12/125,124 Active 2030-05-23 US8356249B2 (en) | 2007-05-22 | 2008-05-22 | Intelligent video tours |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/125,122 Active 2031-05-12 US8350908B2 (en) | 2007-05-22 | 2008-05-22 | Tracking people and objects using multiple live and recorded surveillance camera video feeds |
US12/125,134 Active 2031-11-19 US8527655B2 (en) | 2007-05-22 | 2008-05-22 | Optimal routing of audio, video, and control data through heterogeneous networks |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/125,124 Active 2030-05-23 US8356249B2 (en) | 2007-05-22 | 2008-05-22 | Intelligent video tours |
Country Status (2)
Country | Link |
---|---|
US (4) | US8350908B2 (en) |
WO (4) | WO2008147915A2 (en) |
Cited By (94)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060093190A1 (en) * | 2004-09-17 | 2006-05-04 | Proximex Corporation | Adaptive multi-modal integrated biometric identification detection and surveillance systems |
US20100135157A1 (en) * | 2008-12-02 | 2010-06-03 | Kim Sang Wan | Method and apparatus for controlling traffic according to user |
US20100138842A1 (en) * | 2008-12-03 | 2010-06-03 | Soren Balko | Multithreading And Concurrency Control For A Rule-Based Transaction Engine |
US20120311562A1 (en) * | 2011-06-01 | 2012-12-06 | Yanlin Wang | Extendable event processing |
US8527340B2 (en) | 2011-03-07 | 2013-09-03 | Kba2, Inc. | Systems and methods for analytic data gathering from image providers at an event or geographic location |
US8600167B2 (en) | 2010-05-21 | 2013-12-03 | Hand Held Products, Inc. | System for capturing a document in an image signal |
US8628016B2 (en) | 2011-06-17 | 2014-01-14 | Hand Held Products, Inc. | Terminal operative for storing frame of image data |
US20140337354A1 (en) * | 2012-08-17 | 2014-11-13 | Splunk Inc. | Indexing Preview |
US9047531B2 (en) | 2010-05-21 | 2015-06-02 | Hand Held Products, Inc. | Interactive user interface for capturing a document in an image signal |
US9264474B2 (en) | 2013-05-07 | 2016-02-16 | KBA2 Inc. | System and method of portraying the shifting level of interest in an object or location |
WO2016060741A1 (en) * | 2014-10-15 | 2016-04-21 | Ayla Networks, Inc. | Flexible rules engine for managing connected consumer devices |
US20160125247A1 (en) * | 2014-11-05 | 2016-05-05 | Vivotek Inc. | Surveillance system and surveillance method |
US9544563B1 (en) | 2007-03-23 | 2017-01-10 | Proximex Corporation | Multi-video navigation system |
US9544496B1 (en) | 2007-03-23 | 2017-01-10 | Proximex Corporation | Multi-video navigation |
US9558507B2 (en) | 2012-06-11 | 2017-01-31 | Retailmenot, Inc. | Reminding users of offers |
US9639853B2 (en) | 2012-06-11 | 2017-05-02 | Retailmenot, Inc. | Devices, methods, and computer-readable media for redemption header for merchant offers |
US20170178026A1 (en) * | 2015-12-22 | 2017-06-22 | Sap Se | Log normalization in enterprise threat detection |
US10083359B2 (en) * | 2016-04-28 | 2018-09-25 | Motorola Solutions, Inc. | Method and device for incident situation prediction |
US20180278650A1 (en) * | 2014-09-14 | 2018-09-27 | Sophos Limited | Normalized indications of compromise |
US20190026564A1 (en) * | 2017-07-19 | 2019-01-24 | Pegatron Corporation | Video surveillance system and video surveillance method |
US10523689B2 (en) | 2007-06-12 | 2019-12-31 | Icontrol Networks, Inc. | Communication protocols over internet protocol (IP) networks |
US10559193B2 (en) | 2002-02-01 | 2020-02-11 | Comcast Cable Communications, Llc | Premises management systems |
US10616075B2 (en) | 2007-06-12 | 2020-04-07 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US10616244B2 (en) | 2006-06-12 | 2020-04-07 | Icontrol Networks, Inc. | Activation of gateway device |
US20200134741A1 (en) * | 2018-02-20 | 2020-04-30 | Osram Gmbh | Controlled Agricultural Systems and Methods of Managing Agricultural Systems |
US10657794B1 (en) | 2007-02-28 | 2020-05-19 | Icontrol Networks, Inc. | Security, monitoring and automation controller access and use of legacy security control panel information |
US10666523B2 (en) | 2007-06-12 | 2020-05-26 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US10672254B2 (en) | 2007-04-23 | 2020-06-02 | Icontrol Networks, Inc. | Method and system for providing alternate network access |
US10691295B2 (en) | 2004-03-16 | 2020-06-23 | Icontrol Networks, Inc. | User interface in a premises network |
US10692356B2 (en) | 2004-03-16 | 2020-06-23 | Icontrol Networks, Inc. | Control system user interface |
US10721087B2 (en) | 2005-03-16 | 2020-07-21 | Icontrol Networks, Inc. | Method for networked touchscreen with integrated interfaces |
US10735249B2 (en) | 2004-03-16 | 2020-08-04 | Icontrol Networks, Inc. | Management of a security system at a premises |
US10741057B2 (en) | 2010-12-17 | 2020-08-11 | Icontrol Networks, Inc. | Method and system for processing security event data |
US10747216B2 (en) | 2007-02-28 | 2020-08-18 | Icontrol Networks, Inc. | Method and system for communicating with and controlling an alarm system from a remote server |
US10754304B2 (en) | 2004-03-16 | 2020-08-25 | Icontrol Networks, Inc. | Automation system with mobile interface |
US10785319B2 (en) | 2006-06-12 | 2020-09-22 | Icontrol Networks, Inc. | IP device discovery systems and methods |
US10796557B2 (en) | 2004-03-16 | 2020-10-06 | Icontrol Networks, Inc. | Automation system user interface with three-dimensional display |
US10841381B2 (en) | 2005-03-16 | 2020-11-17 | Icontrol Networks, Inc. | Security system with networked touchscreen |
US10930136B2 (en) | 2005-03-16 | 2021-02-23 | Icontrol Networks, Inc. | Premise management systems and methods |
US10979389B2 (en) | 2004-03-16 | 2021-04-13 | Icontrol Networks, Inc. | Premises management configuration and control |
US10992784B2 (en) | 2004-03-16 | 2021-04-27 | Control Networks, Inc. | Communication protocols over internet protocol (IP) networks |
US10999254B2 (en) | 2005-03-16 | 2021-05-04 | Icontrol Networks, Inc. | System for data routing in networks |
US11043112B2 (en) | 2004-03-16 | 2021-06-22 | Icontrol Networks, Inc. | Integrated security system with parallel processing architecture |
US11089122B2 (en) | 2007-06-12 | 2021-08-10 | Icontrol Networks, Inc. | Controlling data routing among networks |
US11113950B2 (en) | 2005-03-16 | 2021-09-07 | Icontrol Networks, Inc. | Gateway integrated with premises security system |
WO2021190284A1 (en) * | 2020-03-27 | 2021-09-30 | 杭州海康威视数字技术股份有限公司 | Method, system, and device for determining action |
US11146637B2 (en) | 2014-03-03 | 2021-10-12 | Icontrol Networks, Inc. | Media content management |
US11153266B2 (en) | 2004-03-16 | 2021-10-19 | Icontrol Networks, Inc. | Gateway registry methods and systems |
US11182060B2 (en) | 2004-03-16 | 2021-11-23 | Icontrol Networks, Inc. | Networked touchscreen with integrated interfaces |
US11184322B2 (en) | 2004-03-16 | 2021-11-23 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US11190578B2 (en) | 2008-08-11 | 2021-11-30 | Icontrol Networks, Inc. | Integrated cloud system with lightweight gateway for premises automation |
US11201755B2 (en) | 2004-03-16 | 2021-12-14 | Icontrol Networks, Inc. | Premises system management using status signal |
US11212192B2 (en) | 2007-06-12 | 2021-12-28 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US11218878B2 (en) | 2007-06-12 | 2022-01-04 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US11237714B2 (en) | 2007-06-12 | 2022-02-01 | Control Networks, Inc. | Control system user interface |
US11240059B2 (en) | 2010-12-20 | 2022-02-01 | Icontrol Networks, Inc. | Defining and implementing sensor triggered response rules |
US11244545B2 (en) | 2004-03-16 | 2022-02-08 | Icontrol Networks, Inc. | Cross-client sensor user interface in an integrated security network |
US11252168B2 (en) | 2015-12-22 | 2022-02-15 | Sap Se | System and user context in enterprise threat detection |
US11258625B2 (en) | 2008-08-11 | 2022-02-22 | Icontrol Networks, Inc. | Mobile premises automation platform |
US11277465B2 (en) | 2004-03-16 | 2022-03-15 | Icontrol Networks, Inc. | Generating risk profile using data of home monitoring and security system |
US11296950B2 (en) | 2013-06-27 | 2022-04-05 | Icontrol Networks, Inc. | Control system user interface |
US11310199B2 (en) | 2004-03-16 | 2022-04-19 | Icontrol Networks, Inc. | Premises management configuration and control |
WO2022081720A1 (en) * | 2020-10-16 | 2022-04-21 | Splunk Inc. | Rule-based data stream processing |
US11316958B2 (en) | 2008-08-11 | 2022-04-26 | Icontrol Networks, Inc. | Virtual device systems and methods |
US11316753B2 (en) | 2007-06-12 | 2022-04-26 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US11343380B2 (en) | 2004-03-16 | 2022-05-24 | Icontrol Networks, Inc. | Premises system automation |
US11368327B2 (en) | 2008-08-11 | 2022-06-21 | Icontrol Networks, Inc. | Integrated cloud system for premises automation |
US11398147B2 (en) | 2010-09-28 | 2022-07-26 | Icontrol Networks, Inc. | Method, system and apparatus for automated reporting of account and sensor zone information to a central station |
US11405463B2 (en) | 2014-03-03 | 2022-08-02 | Icontrol Networks, Inc. | Media content management |
US11412027B2 (en) | 2007-01-24 | 2022-08-09 | Icontrol Networks, Inc. | Methods and systems for data communication |
US11424980B2 (en) | 2005-03-16 | 2022-08-23 | Icontrol Networks, Inc. | Forming a security network including integrated security system components |
US11423756B2 (en) | 2007-06-12 | 2022-08-23 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US11451409B2 (en) | 2005-03-16 | 2022-09-20 | Icontrol Networks, Inc. | Security network integrating security system and network devices |
US11489812B2 (en) | 2004-03-16 | 2022-11-01 | Icontrol Networks, Inc. | Forming a security network including integrated security system components and network devices |
US11496568B2 (en) | 2005-03-16 | 2022-11-08 | Icontrol Networks, Inc. | Security system with networked touchscreen |
US11582065B2 (en) | 2007-06-12 | 2023-02-14 | Icontrol Networks, Inc. | Systems and methods for device communication |
US11601810B2 (en) | 2007-06-12 | 2023-03-07 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US11615697B2 (en) | 2005-03-16 | 2023-03-28 | Icontrol Networks, Inc. | Premise management systems and methods |
US11646907B2 (en) | 2007-06-12 | 2023-05-09 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US11677577B2 (en) | 2004-03-16 | 2023-06-13 | Icontrol Networks, Inc. | Premises system management using status signal |
US11700142B2 (en) | 2005-03-16 | 2023-07-11 | Icontrol Networks, Inc. | Security network integrating security system and network devices |
US11706279B2 (en) | 2007-01-24 | 2023-07-18 | Icontrol Networks, Inc. | Methods and systems for data communication |
US11706045B2 (en) | 2005-03-16 | 2023-07-18 | Icontrol Networks, Inc. | Modular electronic display platform |
US11729255B2 (en) | 2008-08-11 | 2023-08-15 | Icontrol Networks, Inc. | Integrated cloud system with lightweight gateway for premises automation |
US11750414B2 (en) | 2010-12-16 | 2023-09-05 | Icontrol Networks, Inc. | Bidirectional security sensor communication for a premises security system |
US11758026B2 (en) | 2008-08-11 | 2023-09-12 | Icontrol Networks, Inc. | Virtual device systems and methods |
RU2805125C1 (en) * | 2020-03-27 | 2023-10-11 | Hangzhou Hikvision Digital Technology Co., Ltd. | Method, system and device for determining action |
US11792036B2 (en) | 2008-08-11 | 2023-10-17 | Icontrol Networks, Inc. | Mobile premises automation platform |
US11792330B2 (en) | 2005-03-16 | 2023-10-17 | Icontrol Networks, Inc. | Communication and automation in a premises management system |
US11811845B2 (en) | 2004-03-16 | 2023-11-07 | Icontrol Networks, Inc. | Communication protocols over internet protocol (IP) networks |
US11816323B2 (en) | 2008-06-25 | 2023-11-14 | Icontrol Networks, Inc. | Automation system user interface |
US11831462B2 (en) | 2007-08-24 | 2023-11-28 | Icontrol Networks, Inc. | Controlling data routing in premises management systems |
US11916870B2 (en) | 2004-03-16 | 2024-02-27 | Icontrol Networks, Inc. | Gateway registry methods and systems |
US11916928B2 (en) | 2008-01-24 | 2024-02-27 | Icontrol Networks, Inc. | Communication protocols over internet protocol (IP) networks |
Families Citing this family (89)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8711217B2 (en) | 2000-10-24 | 2014-04-29 | Objectvideo, Inc. | Video surveillance system employing video primitives |
US8564661B2 (en) | 2000-10-24 | 2013-10-22 | Objectvideo, Inc. | Video analytic rule detection system and method |
US9892606B2 (en) | 2001-11-15 | 2018-02-13 | Avigilon Fortress Corporation | Video surveillance system employing video primitives |
US7424175B2 (en) | 2001-03-23 | 2008-09-09 | Objectvideo, Inc. | Video segmentation using statistical pixel modeling |
JP2009533778A (en) | 2006-04-17 | 2009-09-17 | ObjectVideo Inc. | Video segmentation using statistical pixel modeling |
US20090213123A1 (en) * | 2007-12-08 | 2009-08-27 | Dennis Allard Crow | Method of using skeletal animation data to ascertain risk in a surveillance system |
KR101626038B1 (en) * | 2008-08-12 | 2016-05-31 | Google Inc. | Touring in a geographic information system |
US20100097472A1 (en) * | 2008-10-21 | 2010-04-22 | Honeywell International Inc. | Method of efficient camera control and hand over in surveillance management |
EP2352288B1 (en) * | 2008-11-25 | 2018-07-18 | ZTE Corporation | Method for transmitting and receiving the service data of handset tv |
JP5230793B2 (en) * | 2009-02-24 | 2013-07-10 | Mitsubishi Electric Corporation | Person tracking device and person tracking program |
US20100235210A1 (en) * | 2009-03-11 | 2010-09-16 | United Parcel Service Of America, Inc. | Scheduled delivery service systems, apparatuses, methods, and computer programs embodied on computer-readable media |
US9190110B2 (en) | 2009-05-12 | 2015-11-17 | JBF Interlude 2009 LTD | System and method for assembling a recorded composition |
US20110002548A1 (en) * | 2009-07-02 | 2011-01-06 | Honeywell International Inc. | Systems and methods of video navigation |
US20110175999A1 (en) * | 2010-01-15 | 2011-07-21 | Mccormack Kenneth | Video system and method for operating same |
US20110181716A1 (en) * | 2010-01-22 | 2011-07-28 | Crime Point, Incorporated | Video surveillance enhancement facilitating real-time proactive decision making |
US11232458B2 (en) | 2010-02-17 | 2022-01-25 | JBF Interlude 2009 LTD | System and method for data mining within interactive multimedia |
US9082278B2 (en) * | 2010-03-19 | 2015-07-14 | University-Industry Cooperation Group Of Kyung Hee University | Surveillance system |
WO2011116476A1 (en) * | 2010-03-26 | 2011-09-29 | Feeling Software Inc. | Effortless navigation across cameras and cooperative control of cameras |
FR2958822B1 (en) * | 2010-04-09 | 2012-04-13 | Canon KK | METHODS OF TRANSMITTING AND RECEIVING DATA CONTENTS, SOURCE AND DESTINATION NODES, COMPUTER PROGRAM PRODUCT AND CORRESPONDING STORAGE MEDIUM |
WO2010086462A2 (en) * | 2010-05-04 | 2010-08-05 | Phonak Ag | Methods for operating a hearing device as well as hearing devices |
US20110285851A1 (en) * | 2010-05-20 | 2011-11-24 | Honeywell International Inc. | Intruder situation awareness system |
US9118593B2 (en) | 2010-10-07 | 2015-08-25 | Enghouse Networks Limited | System and method for best value routing |
US9384587B2 (en) * | 2010-11-29 | 2016-07-05 | Verizon Patent And Licensing Inc. | Virtual event viewing |
US8773499B2 (en) | 2011-06-24 | 2014-07-08 | Microsoft Corporation | Automatic video framing |
US8862995B1 (en) * | 2011-11-04 | 2014-10-14 | Google Inc. | Automatically creating a movie from geo located content using earth |
TWI450207B (en) * | 2011-12-26 | 2014-08-21 | Ind Tech Res Inst | Method, system, computer program product and computer-readable recording medium for object tracking |
US9197864B1 (en) | 2012-01-06 | 2015-11-24 | Google Inc. | Zoom and image capture based on features of interest |
US8941561B1 (en) | 2012-01-06 | 2015-01-27 | Google Inc. | Image capture |
CN102799618B (en) * | 2012-06-21 | 2017-12-15 | 刘伟 | A kind of vehicle retrieval method in multiple monitoring videos |
US8947536B2 (en) * | 2012-07-13 | 2015-02-03 | International Business Machines Corporation | Automatic failover video coverage of digital video sensing and recording devices |
MX2015003266A (en) * | 2012-09-13 | 2016-02-05 | Curtis J Krull | Method and system for training sports officials. |
US20140287391A1 (en) * | 2012-09-13 | 2014-09-25 | Curt Krull | Method and system for training athletes |
US11412020B2 (en) * | 2012-10-19 | 2022-08-09 | Parallel Wireless, Inc. | Wireless broadband network with integrated streaming multimedia services |
US9087386B2 (en) * | 2012-11-30 | 2015-07-21 | Vidsys, Inc. | Tracking people and objects using multiple live and recorded surveillance camera video feeds |
US10725996B1 (en) * | 2012-12-18 | 2020-07-28 | EMC IP Holding Company LLC | Method and system for determining differing file path hierarchies for backup file paths |
US10931920B2 (en) * | 2013-03-14 | 2021-02-23 | Pelco, Inc. | Auto-learning smart tours for video surveillance |
JP5506990B1 (en) | 2013-07-11 | 2014-05-28 | Panasonic Corporation | Tracking support device, tracking support system, and tracking support method |
JP5438861B1 (en) | 2013-07-11 | 2014-03-12 | Panasonic Corporation | Tracking support device, tracking support system, and tracking support method |
GB2517944A (en) | 2013-09-05 | 2015-03-11 | Ibm | Locating objects using images from portable devices |
US9807574B2 (en) | 2013-10-03 | 2017-10-31 | Parallel Wireless, Inc. | Multicast and broadcast services over a mesh network |
US9875504B1 (en) | 2014-02-16 | 2018-01-23 | Evan Gates Roe | Real-time video streaming of marine life for sale |
US9397947B2 (en) | 2014-03-11 | 2016-07-19 | International Business Machines Corporation | Quality of experience for communication sessions |
US9653115B2 (en) | 2014-04-10 | 2017-05-16 | JBF Interlude 2009 LTD | Systems and methods for creating linear video from branched video |
US9792957B2 (en) | 2014-10-08 | 2017-10-17 | JBF Interlude 2009 LTD | Systems and methods for dynamic video bookmarking |
US11412276B2 (en) | 2014-10-10 | 2022-08-09 | JBF Interlude 2009 LTD | Systems and methods for parallel track transitions |
KR102282456B1 (en) * | 2014-12-05 | 2021-07-28 | Hanwha Techwin Co., Ltd. | Device and Method for displaying heatmap on the floor plan |
US9237307B1 (en) * | 2015-01-30 | 2016-01-12 | Ringcentral, Inc. | System and method for dynamically selecting networked cameras in a video conference |
WO2016133234A1 (en) * | 2015-02-17 | 2016-08-25 | Innodep Co., Ltd. | Image analysis system for analyzing dynamically allocated camera image, integrated control system including same, and operation method therefor |
JP5999394B2 (en) * | 2015-02-20 | 2016-09-28 | Panasonic IP Management Co., Ltd. | Tracking support device, tracking support system, and tracking support method |
US9767564B2 (en) | 2015-08-14 | 2017-09-19 | International Business Machines Corporation | Monitoring of object impressions and viewing patterns |
US10460765B2 (en) | 2015-08-26 | 2019-10-29 | JBF Interlude 2009 LTD | Systems and methods for adaptive and responsive video |
US10004019B2 (en) | 2015-09-08 | 2018-06-19 | Parallel Wireless, Inc. | RAN for multimedia delivery |
US10887239B2 (en) | 2015-09-08 | 2021-01-05 | Parallel Wireless, Inc. | RAN for multimedia delivery |
US11164548B2 (en) * | 2015-12-22 | 2021-11-02 | JBF Interlude 2009 LTD | Intelligent buffering of large-scale video |
US11258985B2 (en) * | 2016-04-05 | 2022-02-22 | Verint Systems Inc. | Target tracking in a multi-camera surveillance system |
US11856271B2 (en) | 2016-04-12 | 2023-12-26 | JBF Interlude 2009 LTD | Symbiotic interactive video |
EP3291191B1 (en) * | 2016-08-29 | 2019-10-09 | Panasonic Intellectual Property Management Co., Ltd. | Suspicious person report system and suspicious person report method |
US10489659B2 (en) | 2016-09-07 | 2019-11-26 | Verint Americas Inc. | System and method for searching video |
US11050809B2 (en) | 2016-12-30 | 2021-06-29 | JBF Interlude 2009 LTD | Systems and methods for dynamic weighting of branched video paths |
US10679669B2 (en) | 2017-01-18 | 2020-06-09 | Microsoft Technology Licensing, Llc | Automatic narration of signal segment |
US10482900B2 (en) | 2017-01-18 | 2019-11-19 | Microsoft Technology Licensing, Llc | Organization of signal segments supporting sensed features |
US10437884B2 (en) | 2017-01-18 | 2019-10-08 | Microsoft Technology Licensing, Llc | Navigation of computer-navigable physical feature graph |
US11094212B2 (en) | 2017-01-18 | 2021-08-17 | Microsoft Technology Licensing, Llc | Sharing signal segments of physical graph |
US10635981B2 (en) * | 2017-01-18 | 2020-04-28 | Microsoft Technology Licensing, Llc | Automated movement orchestration |
US10606814B2 (en) | 2017-01-18 | 2020-03-31 | Microsoft Technology Licensing, Llc | Computer-aided tracking of physical entities |
US10637814B2 (en) | 2017-01-18 | 2020-04-28 | Microsoft Technology Licensing, Llc | Communication routing based on physical status |
US11494729B1 (en) * | 2017-03-27 | 2022-11-08 | Amazon Technologies, Inc. | Identifying user-item interactions in an automated facility |
US11087271B1 (en) | 2017-03-27 | 2021-08-10 | Amazon Technologies, Inc. | Identifying user-item interactions in an automated facility |
US11238401B1 (en) | 2017-03-27 | 2022-02-01 | Amazon Technologies, Inc. | Identifying user-item interactions in an automated facility |
US10094662B1 (en) | 2017-03-28 | 2018-10-09 | Trimble Inc. | Three-dimension position and heading solution |
US10406645B2 (en) | 2017-05-24 | 2019-09-10 | Trimble Inc. | Calibration approach for camera placement |
US10341618B2 (en) | 2017-05-24 | 2019-07-02 | Trimble Inc. | Infrastructure positioning camera system |
US10300573B2 (en) | 2017-05-24 | 2019-05-28 | Trimble Inc. | Measurement, layout, marking, firestop stick |
US10347008B2 (en) | 2017-08-14 | 2019-07-09 | Trimble Inc. | Self positioning camera system to 3D CAD/BIM model |
US10339670B2 (en) * | 2017-08-29 | 2019-07-02 | Trimble Inc. | 3D tool tracking and positioning using cameras |
US10257578B1 (en) | 2018-01-05 | 2019-04-09 | JBF Interlude 2009 LTD | Dynamic library display for interactive videos |
US10791047B2 (en) * | 2018-02-19 | 2020-09-29 | Disney Enterprises, Inc. | Automated network navigation |
US11097921B2 (en) * | 2018-04-10 | 2021-08-24 | International Business Machines Corporation | Elevator movement plan generation |
US10951859B2 (en) | 2018-05-30 | 2021-03-16 | Microsoft Technology Licensing, Llc | Videoconferencing device and method |
US11601721B2 (en) | 2018-06-04 | 2023-03-07 | JBF Interlude 2009 LTD | Interactive video dynamic adaptation and user profiling |
WO2020055660A1 (en) | 2018-09-13 | 2020-03-19 | Carrier Corporation | Space determination with boundary visualization |
US20200250736A1 (en) * | 2019-02-05 | 2020-08-06 | Adroit Worldwide Media, Inc. | Systems, method and apparatus for frictionless shopping |
US10997747B2 (en) | 2019-05-09 | 2021-05-04 | Trimble Inc. | Target positioning with bundle adjustment |
US11002541B2 (en) | 2019-07-23 | 2021-05-11 | Trimble Inc. | Target positioning with electronic distance measuring and bundle adjustment |
US11490047B2 (en) | 2019-10-02 | 2022-11-01 | JBF Interlude 2009 LTD | Systems and methods for dynamically adjusting video aspect ratios |
WO2021150305A2 (en) | 2019-11-27 | 2021-07-29 | Vivonics, Inc. | Contactless system and method for assessing and/or determining hemodynamic parameters and/or vital signs |
US11245961B2 (en) | 2020-02-18 | 2022-02-08 | JBF Interlude 2009 LTD | System and methods for detecting anomalous activities for interactive videos |
CN111601148B (en) * | 2020-05-29 | 2022-02-22 | 广州酷狗计算机科技有限公司 | Information display method, device, terminal and storage medium |
US11882337B2 (en) | 2021-05-28 | 2024-01-23 | JBF Interlude 2009 LTD | Automated platform for generating interactive videos |
Citations (29)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20010029613A1 (en) * | 1998-03-19 | 2001-10-11 | Fernandez Dennis Sunga | Integrated network for monitoring remote objects |
US20010039579A1 (en) * | 1996-11-06 | 2001-11-08 | Milan V. Trcka | Network security and surveillance system |
US6408404B1 (en) * | 1998-07-29 | 2002-06-18 | Northrop Grumman Corporation | System and method for ensuring and managing situation awareness |
US6604093B1 (en) * | 1999-12-27 | 2003-08-05 | International Business Machines Corporation | Situation awareness system |
US20040105006A1 (en) * | 2002-12-03 | 2004-06-03 | Lazo Philip A. | Event driven video tracking system |
US20040143602A1 (en) * | 2002-10-18 | 2004-07-22 | Antonio Ruiz | Apparatus, system and method for automated and adaptive digital image/video surveillance for events and configurations using a rich multimedia relational database |
US20040223054A1 (en) * | 2003-05-06 | 2004-11-11 | Rotholtz Ben Aaron | Multi-purpose video surveillance |
US20050110637A1 (en) * | 2003-11-26 | 2005-05-26 | International Business Machines Corporation | System and method for alarm generation based on the detection of the presence of a person |
US6906709B1 (en) * | 2001-02-27 | 2005-06-14 | Applied Visions, Inc. | Visualizing security incidents in a computer network |
US20050132414A1 (en) * | 2003-12-02 | 2005-06-16 | Connexed, Inc. | Networked video surveillance system |
US20050162515A1 (en) * | 2000-10-24 | 2005-07-28 | Objectvideo, Inc. | Video surveillance system |
US20050222811A1 (en) * | 2004-04-03 | 2005-10-06 | Altusys Corp | Method and Apparatus for Context-Sensitive Event Correlation with External Control in Situation-Based Management |
US20060066722A1 (en) * | 2004-09-28 | 2006-03-30 | Objectvideo, Inc. | View handling in video surveillance systems |
US20060262958A1 (en) * | 2005-05-19 | 2006-11-23 | Objectvideo, Inc. | Periodic motion detection with applications to multi-grabbing |
US20070035622A1 (en) * | 2004-06-01 | 2007-02-15 | Hanna Keith J | Method and apparatus for video surveillance |
US20070282665A1 (en) * | 2006-06-02 | 2007-12-06 | Buehler Christopher J | Systems and methods for providing video surveillance data |
US7380279B2 (en) * | 2001-07-16 | 2008-05-27 | Lenel Systems International, Inc. | System for integrating security and access for facilities and information systems |
US20080184245A1 (en) * | 2007-01-30 | 2008-07-31 | March Networks Corporation | Method and system for task-based video analytics processing |
US20080198159A1 (en) * | 2007-02-16 | 2008-08-21 | Matsushita Electric Industrial Co., Ltd. | Method and apparatus for efficient and flexible surveillance visualization with context sensitive privacy preserving and power lens data mining |
US20080209505A1 (en) * | 2006-08-14 | 2008-08-28 | Quantum Secure, Inc. | Policy-based physical security system for restricting access to computer resources and data flow through network equipment |
US20080208828A1 (en) * | 2005-03-21 | 2008-08-28 | Oren Boiman | Detecting Irregularities |
US7467400B1 (en) * | 2003-02-14 | 2008-12-16 | S2 Security Corporation | Integrated security system having network enabled access control and interface devices |
US7599957B2 (en) * | 2006-02-15 | 2009-10-06 | Panasonic Corporation | System and method for high performance template driven metadata schema mapping and data storage for surveillance and sensor devices |
US7768548B2 (en) * | 2005-08-12 | 2010-08-03 | William Bradford Silvernail | Mobile digital video recording system |
US7840515B2 (en) * | 2007-02-16 | 2010-11-23 | Panasonic Corporation | System architecture and process for automating intelligent surveillance center operations |
US7864980B2 (en) * | 2002-08-15 | 2011-01-04 | Roke Manor Research Limited | Video motion anomaly detector |
US7868912B2 (en) * | 2000-10-24 | 2011-01-11 | Objectvideo, Inc. | Video surveillance system employing video primitives |
US8176527B1 (en) * | 2002-12-02 | 2012-05-08 | Hewlett-Packard Development Company, L. P. | Correlation engine with support for time-based rules |
US8195598B2 (en) * | 2007-11-16 | 2012-06-05 | Agilence, Inc. | Method of and system for hierarchical human/crowd behavior detection |
Family Cites Families (67)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4992866A (en) * | 1989-06-29 | 1991-02-12 | Morgan Jack B | Camera selection and positioning system and method |
US5258837A (en) * | 1991-01-07 | 1993-11-02 | Zandar Research Limited | Multiple security video display |
CA2081040A1 (en) * | 1993-04-13 | 1994-10-14 | Nizar Ladha | Alarm panel with cellular telephone backup |
US6768563B1 (en) * | 1995-02-24 | 2004-07-27 | Canon Kabushiki Kaisha | Image input system |
US7116357B1 (en) * | 1995-03-20 | 2006-10-03 | Canon Kabushiki Kaisha | Camera monitoring system |
US5729471A (en) * | 1995-03-31 | 1998-03-17 | The Regents Of The University Of California | Machine dynamic selection of one video camera/image of a scene from multiple video cameras/images of the scene in accordance with a particular perspective on the scene, an object in the scene, or an event in the scene |
US6304549B1 (en) * | 1996-09-12 | 2001-10-16 | Lucent Technologies Inc. | Virtual path management in hierarchical ATM networks |
US5875305A (en) * | 1996-10-31 | 1999-02-23 | Sensormatic Electronics Corporation | Video information management system which provides intelligent responses to video data content features |
KR100194608B1 (en) | 1996-11-20 | 1999-06-15 | Lee Kye-Chul | Multicast Path Allocation Method in ATM Networks |
US6359647B1 (en) * | 1998-08-07 | 2002-03-19 | Philips Electronics North America Corporation | Automated camera handoff system for figure tracking in a multiple camera system |
US20020097322A1 (en) * | 2000-11-29 | 2002-07-25 | Monroe David A. | Multiple video display configurations and remote control of multiple video signals transmitted to a monitoring station over a network |
US6314204B1 (en) | 1998-11-03 | 2001-11-06 | Compaq Computer Corporation | Multiple mode probability density estimation with application to multiple hypothesis tracking |
AU761072C (en) * | 1998-11-23 | 2003-07-10 | Nestor, Inc. | Traffic light violation prediction and recording system |
US6437819B1 (en) * | 1999-06-25 | 2002-08-20 | Rohan Christopher Loveland | Automated video person tracking system |
US6476858B1 (en) * | 1999-08-12 | 2002-11-05 | Innovation Institute | Video monitoring and security system |
US7310111B2 (en) * | 1999-08-12 | 2007-12-18 | Innovation Institute | Video monitoring and security system |
US6556960B1 (en) | 1999-09-01 | 2003-04-29 | Microsoft Corporation | Variational inference engine for probabilistic graphical models |
US6553515B1 (en) * | 1999-09-10 | 2003-04-22 | Comdial Corporation | System, method and computer program product for diagnostic supervision of internet connections |
GB9929870D0 (en) * | 1999-12-18 | 2000-02-09 | Roke Manor Research | Improvements in or relating to security camera systems |
EP1968321B1 (en) * | 2000-02-28 | 2015-06-17 | Hitachi Kokusai Electric Inc. | Intruding object monitoring method and intruding object monitoring system |
US7522186B2 (en) * | 2000-03-07 | 2009-04-21 | L-3 Communications Corporation | Method and apparatus for providing immersive surveillance |
US6970942B1 (en) | 2000-05-08 | 2005-11-29 | Crossroads Systems, Inc. | Method of routing HTTP and FTP services across heterogeneous networks |
AU2001264723A1 (en) * | 2000-05-18 | 2001-11-26 | Imove Inc. | Multiple camera video system which displays selected images |
US20040125877A1 (en) * | 2000-07-17 | 2004-07-01 | Shin-Fu Chang | Method and system for indexing and content-based adaptive streaming of digital video content |
US6807361B1 (en) * | 2000-07-18 | 2004-10-19 | Fuji Xerox Co., Ltd. | Interactive custom video creation system |
US7058204B2 (en) * | 2000-10-03 | 2006-06-06 | Gesturetek, Inc. | Multiple camera control system |
WO2002065763A2 (en) * | 2001-02-12 | 2002-08-22 | Carnegie Mellon University | System and method for manipulating the point of interest in a sequence of images |
US7106333B1 (en) * | 2001-02-16 | 2006-09-12 | Vistascape Security Systems Corp. | Surveillance system |
JP2002251608A (en) * | 2001-02-23 | 2002-09-06 | Mixed Reality Systems Laboratory Inc | Device and method for controlling imaging device, device and method for image processing, program code, and storage medium |
EP1410621A1 (en) * | 2001-06-28 | 2004-04-21 | Omnivee Inc. | Method and apparatus for control and processing of video images |
IL144141A0 (en) * | 2001-07-04 | 2002-05-23 | | Method and system for improving a route along which data is sent using an IP protocol in a data communications network |
US6539300B2 (en) * | 2001-07-10 | 2003-03-25 | Makor Issues And Rights Ltd. | Method for regional system wide optimal signal timing for traffic control based on wireless phone networks |
US7342489B1 (en) * | 2001-09-06 | 2008-03-11 | Siemens Schweiz Ag | Surveillance system control unit |
US7242295B1 (en) * | 2001-09-06 | 2007-07-10 | Vistascape Security Systems Corp. | Security data management system |
KR100459893B1 (en) * | 2002-01-08 | 2004-12-04 | Samsung Electronics Co., Ltd. | Method and apparatus for color-based object tracking in video sequences |
KR100959470B1 (en) * | 2002-03-22 | 2010-05-25 | Michael F. Deering | Scalable high performance 3d graphics |
US7149974B2 (en) * | 2002-04-03 | 2006-12-12 | Fuji Xerox Co., Ltd. | Reduced representations of video sequences |
US7324981B2 (en) | 2002-05-16 | 2008-01-29 | Microsoft Corporation | System and method of employing efficient operators for Bayesian network search |
US7222300B2 (en) * | 2002-06-19 | 2007-05-22 | Microsoft Corporation | System and method for automatically authoring video compositions using video cliplets |
US7221775B2 (en) * | 2002-11-12 | 2007-05-22 | Intellivid Corporation | Method and apparatus for computerized image background analysis |
US7275210B2 (en) * | 2003-03-21 | 2007-09-25 | Fuji Xerox Co., Ltd. | Systems and methods for generating video summary image layouts |
JP4568009B2 (en) * | 2003-04-22 | 2010-10-27 | Panasonic Corporation | Monitoring device with camera cooperation |
US7817716B2 (en) * | 2003-05-29 | 2010-10-19 | Lsi Corporation | Method and/or apparatus for analyzing the content of a surveillance image |
US7242423B2 (en) * | 2003-06-16 | 2007-07-10 | Active Eye, Inc. | Linking zones for object tracking and camera handoff |
JP4195991B2 (en) * | 2003-06-18 | 2008-12-17 | Panasonic Corporation | Surveillance video monitoring system, surveillance video generation method, and surveillance video monitoring server |
US7664292B2 (en) * | 2003-12-03 | 2010-02-16 | Safehouse International, Inc. | Monitoring an output from a camera |
US7697026B2 (en) | 2004-03-16 | 2010-04-13 | 3Vr Security, Inc. | Pipeline architecture for analyzing multiple video streams |
EP2408192A3 (en) * | 2004-04-16 | 2014-01-01 | James A. Aman | Multiple view compositing and object tracking system |
US7752548B2 (en) * | 2004-10-29 | 2010-07-06 | Microsoft Corporation | Features such as titles, transitions, and/or effects which vary according to positions |
US20060161335A1 (en) * | 2005-01-14 | 2006-07-20 | Ross Beinhaker | Routing system and method |
WO2007094802A2 (en) * | 2005-03-25 | 2007-08-23 | Intellivid Corporation | Intelligent camera selection and object tracking |
US7843491B2 (en) * | 2005-04-05 | 2010-11-30 | 3Vr Security, Inc. | Monitoring and presenting video surveillance data |
US7720257B2 (en) * | 2005-06-16 | 2010-05-18 | Honeywell International Inc. | Object tracking system |
US8089563B2 (en) * | 2005-06-17 | 2012-01-03 | Fuji Xerox Co., Ltd. | Method and system for analyzing fixed-camera video via the selection, visualization, and interaction with storyboard keyframes |
US8284254B2 (en) * | 2005-08-11 | 2012-10-09 | Sightlogix, Inc. | Methods and apparatus for a wide area coordinated surveillance system |
DE102005046664B4 (en) * | 2005-09-29 | 2016-11-17 | Robert Bosch GmbH | Method for creating a flexible display area for a video surveillance system |
US7912628B2 (en) * | 2006-03-03 | 2011-03-22 | Inrix, Inc. | Determining road traffic conditions using data from multiple data sources |
US7823056B1 (en) * | 2006-03-15 | 2010-10-26 | Adobe Systems Incorporated | Multiple-camera video recording |
US8738013B2 (en) * | 2006-04-24 | 2014-05-27 | Marvell World Trade Ltd. | 802.11 mesh architecture |
US7546398B2 (en) * | 2006-08-01 | 2009-06-09 | International Business Machines Corporation | System and method for distributing virtual input/output operations across multiple logical partitions |
US8274564B2 (en) * | 2006-10-13 | 2012-09-25 | Fuji Xerox Co., Ltd. | Interface for browsing and viewing video from multiple cameras simultaneously that conveys spatial and temporal proximity |
US7843820B2 (en) * | 2006-10-30 | 2010-11-30 | Research In Motion Limited | Wi-Fi quality of service signaling |
US8125520B2 (en) * | 2006-11-21 | 2012-02-28 | Honeywell International Inc. | System for establishing references for multi-image stitching |
US8428152B2 (en) * | 2006-12-15 | 2013-04-23 | Amimon Ltd. | Device, method and system of uplink communication between wireless video modules |
US8760519B2 (en) * | 2007-02-16 | 2014-06-24 | Panasonic Corporation | Threat-detection in a distributed multi-camera surveillance system |
WO2008103850A2 (en) * | 2007-02-21 | 2008-08-28 | Pixel Velocity, Inc. | Scalable system for wide area surveillance |
US7777783B1 (en) * | 2007-03-23 | 2010-08-17 | Proximex Corporation | Multi-video navigation |
- 2008
- 2008-05-22 US US12/125,122 patent/US8350908B2/en active Active
- 2008-05-22 WO PCT/US2008/064582 patent/WO2008147915A2/en active Application Filing
- 2008-05-22 US US12/125,134 patent/US8527655B2/en active Active
- 2008-05-22 WO PCT/US2008/064512 patent/WO2008147874A2/en active Application Filing
- 2008-05-22 WO PCT/US2008/064587 patent/WO2009032366A2/en active Application Filing
- 2008-05-22 US US12/125,115 patent/US20080294588A1/en not_active Abandoned
- 2008-05-22 US US12/125,124 patent/US8356249B2/en active Active
- 2008-05-22 WO PCT/US2008/064578 patent/WO2008147913A2/en active Application Filing
Patent Citations (30)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20010039579A1 (en) * | 1996-11-06 | 2001-11-08 | Milan V. Trcka | Network security and surveillance system |
US20010029613A1 (en) * | 1998-03-19 | 2001-10-11 | Fernandez Dennis Sunga | Integrated network for monitoring remote objects |
US6408404B1 (en) * | 1998-07-29 | 2002-06-18 | Northrop Grumman Corporation | System and method for ensuring and managing situation awareness |
US6604093B1 (en) * | 1999-12-27 | 2003-08-05 | International Business Machines Corporation | Situation awareness system |
US20050162515A1 (en) * | 2000-10-24 | 2005-07-28 | Objectvideo, Inc. | Video surveillance system |
US7868912B2 (en) * | 2000-10-24 | 2011-01-11 | Objectvideo, Inc. | Video surveillance system employing video primitives |
US6906709B1 (en) * | 2001-02-27 | 2005-06-14 | Applied Visions, Inc. | Visualizing security incidents in a computer network |
US7380279B2 (en) * | 2001-07-16 | 2008-05-27 | Lenel Systems International, Inc. | System for integrating security and access for facilities and information systems |
US7864980B2 (en) * | 2002-08-15 | 2011-01-04 | Roke Manor Research Limited | Video motion anomaly detector |
US20040143602A1 (en) * | 2002-10-18 | 2004-07-22 | Antonio Ruiz | Apparatus, system and method for automated and adaptive digital image/video surveillance for events and configurations using a rich multimedia relational database |
US8176527B1 (en) * | 2002-12-02 | 2012-05-08 | Hewlett-Packard Development Company, L. P. | Correlation engine with support for time-based rules |
US20040105006A1 (en) * | 2002-12-03 | 2004-06-03 | Lazo Philip A. | Event driven video tracking system |
US7467400B1 (en) * | 2003-02-14 | 2008-12-16 | S2 Security Corporation | Integrated security system having network enabled access control and interface devices |
US20040223054A1 (en) * | 2003-05-06 | 2004-11-11 | Rotholtz Ben Aaron | Multi-purpose video surveillance |
US20050110637A1 (en) * | 2003-11-26 | 2005-05-26 | International Business Machines Corporation | System and method for alarm generation based on the detection of the presence of a person |
US20050132414A1 (en) * | 2003-12-02 | 2005-06-16 | Connexed, Inc. | Networked video surveillance system |
US20050222811A1 (en) * | 2004-04-03 | 2005-10-06 | Altusys Corp | Method and Apparatus for Context-Sensitive Event Correlation with External Control in Situation-Based Management |
US20070035622A1 (en) * | 2004-06-01 | 2007-02-15 | Hanna Keith J | Method and apparatus for video surveillance |
US20060066722A1 (en) * | 2004-09-28 | 2006-03-30 | Objectvideo, Inc. | View handling in video surveillance systems |
US20080208828A1 (en) * | 2005-03-21 | 2008-08-28 | Oren Boiman | Detecting Irregularities |
US7613322B2 (en) * | 2005-05-19 | 2009-11-03 | Objectvideo, Inc. | Periodic motion detection with applications to multi-grabbing |
US20060262958A1 (en) * | 2005-05-19 | 2006-11-23 | Objectvideo, Inc. | Periodic motion detection with applications to multi-grabbing |
US7768548B2 (en) * | 2005-08-12 | 2010-08-03 | William Bradford Silvernail | Mobile digital video recording system |
US7599957B2 (en) * | 2006-02-15 | 2009-10-06 | Panasonic Corporation | System and method for high performance template driven metadata schema mapping and data storage for surveillance and sensor devices |
US20070282665A1 (en) * | 2006-06-02 | 2007-12-06 | Buehler Christopher J | Systems and methods for providing video surveillance data |
US20080209505A1 (en) * | 2006-08-14 | 2008-08-28 | Quantum Secure, Inc. | Policy-based physical security system for restricting access to computer resources and data flow through network equipment |
US20080184245A1 (en) * | 2007-01-30 | 2008-07-31 | March Networks Corporation | Method and system for task-based video analytics processing |
US7840515B2 (en) * | 2007-02-16 | 2010-11-23 | Panasonic Corporation | System architecture and process for automating intelligent surveillance center operations |
US20080198159A1 (en) * | 2007-02-16 | 2008-08-21 | Matsushita Electric Industrial Co., Ltd. | Method and apparatus for efficient and flexible surveillance visualization with context sensitive privacy preserving and power lens data mining |
US8195598B2 (en) * | 2007-11-16 | 2012-06-05 | Agilence, Inc. | Method of and system for hierarchical human/crowd behavior detection |
Cited By (175)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10559193B2 (en) | 2002-02-01 | 2020-02-11 | Comcast Cable Communications, LLC | Premises management systems |
US11601397B2 (en) | 2004-03-16 | 2023-03-07 | Icontrol Networks, Inc. | Premises management configuration and control |
US10691295B2 (en) | 2004-03-16 | 2020-06-23 | Icontrol Networks, Inc. | User interface in a premises network |
US11082395B2 (en) | 2004-03-16 | 2021-08-03 | Icontrol Networks, Inc. | Premises management configuration and control |
US11916870B2 (en) | 2004-03-16 | 2024-02-27 | Icontrol Networks, Inc. | Gateway registry methods and systems |
US11893874B2 (en) | 2004-03-16 | 2024-02-06 | Icontrol Networks, Inc. | Networked touchscreen with integrated interfaces |
US11043112B2 (en) | 2004-03-16 | 2021-06-22 | Icontrol Networks, Inc. | Integrated security system with parallel processing architecture |
US11811845B2 (en) | 2004-03-16 | 2023-11-07 | Icontrol Networks, Inc. | Communication protocols over internet protocol (IP) networks |
US11037433B2 (en) | 2004-03-16 | 2021-06-15 | Icontrol Networks, Inc. | Management of a security system at a premises |
US11810445B2 (en) | 2004-03-16 | 2023-11-07 | Icontrol Networks, Inc. | Cross-client sensor user interface in an integrated security network |
US11184322B2 (en) | 2004-03-16 | 2021-11-23 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US10992784B2 (en) | 2004-03-16 | 2021-04-27 | Icontrol Networks, Inc. | Communication protocols over internet protocol (IP) networks |
US11782394B2 (en) | 2004-03-16 | 2023-10-10 | Icontrol Networks, Inc. | Automation system with mobile interface |
US11757834B2 (en) | 2004-03-16 | 2023-09-12 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US10979389B2 (en) | 2004-03-16 | 2021-04-13 | Icontrol Networks, Inc. | Premises management configuration and control |
US10890881B2 (en) | 2004-03-16 | 2021-01-12 | Icontrol Networks, Inc. | Premises management networking |
US11153266B2 (en) | 2004-03-16 | 2021-10-19 | Icontrol Networks, Inc. | Gateway registry methods and systems |
US11677577B2 (en) | 2004-03-16 | 2023-06-13 | Icontrol Networks, Inc. | Premises system management using status signal |
US10796557B2 (en) | 2004-03-16 | 2020-10-06 | Icontrol Networks, Inc. | Automation system user interface with three-dimensional display |
US10754304B2 (en) | 2004-03-16 | 2020-08-25 | Icontrol Networks, Inc. | Automation system with mobile interface |
US11656667B2 (en) | 2004-03-16 | 2023-05-23 | Icontrol Networks, Inc. | Integrated security system with parallel processing architecture |
US11626006B2 (en) | 2004-03-16 | 2023-04-11 | Icontrol Networks, Inc. | Management of a security system at a premises |
US11625008B2 (en) | 2004-03-16 | 2023-04-11 | Icontrol Networks, Inc. | Premises management networking |
US11159484B2 (en) | 2004-03-16 | 2021-10-26 | Icontrol Networks, Inc. | Forming a security network including integrated security system components and network devices |
US10735249B2 (en) | 2004-03-16 | 2020-08-04 | Icontrol Networks, Inc. | Management of a security system at a premises |
US11175793B2 (en) | 2004-03-16 | 2021-11-16 | Icontrol Networks, Inc. | User interface in a premises network |
US11182060B2 (en) | 2004-03-16 | 2021-11-23 | Icontrol Networks, Inc. | Networked touchscreen with integrated interfaces |
US10692356B2 (en) | 2004-03-16 | 2020-06-23 | Icontrol Networks, Inc. | Control system user interface |
US11201755B2 (en) | 2004-03-16 | 2021-12-14 | Icontrol Networks, Inc. | Premises system management using status signal |
US11588787B2 (en) | 2004-03-16 | 2023-02-21 | Icontrol Networks, Inc. | Premises management configuration and control |
US11537186B2 (en) | 2004-03-16 | 2022-12-27 | Icontrol Networks, Inc. | Integrated security system with parallel processing architecture |
US11244545B2 (en) | 2004-03-16 | 2022-02-08 | Icontrol Networks, Inc. | Cross-client sensor user interface in an integrated security network |
US11489812B2 (en) | 2004-03-16 | 2022-11-01 | Icontrol Networks, Inc. | Forming a security network including integrated security system components and network devices |
US11449012B2 (en) | 2004-03-16 | 2022-09-20 | Icontrol Networks, Inc. | Premises management networking |
US11277465B2 (en) | 2004-03-16 | 2022-03-15 | Icontrol Networks, Inc. | Generating risk profile using data of home monitoring and security system |
US11310199B2 (en) | 2004-03-16 | 2022-04-19 | Icontrol Networks, Inc. | Premises management configuration and control |
US11343380B2 (en) | 2004-03-16 | 2022-05-24 | Icontrol Networks, Inc. | Premises system automation |
US11368429B2 (en) | 2004-03-16 | 2022-06-21 | Icontrol Networks, Inc. | Premises management configuration and control |
US11378922B2 (en) | 2004-03-16 | 2022-07-05 | Icontrol Networks, Inc. | Automation system with mobile interface |
US11410531B2 (en) | 2004-03-16 | 2022-08-09 | Icontrol Networks, Inc. | Automation system user interface with three-dimensional display |
US20060093190A1 (en) * | 2004-09-17 | 2006-05-04 | Proximex Corporation | Adaptive multi-modal integrated biometric identification detection and surveillance systems |
US9432632B2 (en) | 2004-09-17 | 2016-08-30 | Proximex Corporation | Adaptive multi-modal integrated biometric identification and surveillance systems |
US7956890B2 (en) | 2004-09-17 | 2011-06-07 | Proximex Corporation | Adaptive multi-modal integrated biometric identification detection and surveillance systems |
US8976237B2 (en) | 2004-09-17 | 2015-03-10 | Proximex Corporation | Adaptive multi-modal integrated biometric identification detection and surveillance systems |
US11424980B2 (en) | 2005-03-16 | 2022-08-23 | Icontrol Networks, Inc. | Forming a security network including integrated security system components |
US11706045B2 (en) | 2005-03-16 | 2023-07-18 | Icontrol Networks, Inc. | Modular electronic display platform |
US11700142B2 (en) | 2005-03-16 | 2023-07-11 | Icontrol Networks, Inc. | Security network integrating security system and network devices |
US11451409B2 (en) | 2005-03-16 | 2022-09-20 | Icontrol Networks, Inc. | Security network integrating security system and network devices |
US11496568B2 (en) | 2005-03-16 | 2022-11-08 | Icontrol Networks, Inc. | Security system with networked touchscreen |
US11367340B2 (en) | 2005-03-16 | 2022-06-21 | Icontrol Networks, Inc. | Premise management systems and methods |
US11595364B2 (en) | 2005-03-16 | 2023-02-28 | Icontrol Networks, Inc. | System for data routing in networks |
US11824675B2 (en) | 2005-03-16 | 2023-11-21 | Icontrol Networks, Inc. | Networked touchscreen with integrated interfaces |
US10841381B2 (en) | 2005-03-16 | 2020-11-17 | Icontrol Networks, Inc. | Security system with networked touchscreen |
US10999254B2 (en) | 2005-03-16 | 2021-05-04 | Icontrol Networks, Inc. | System for data routing in networks |
US11792330B2 (en) | 2005-03-16 | 2023-10-17 | Icontrol Networks, Inc. | Communication and automation in a premises management system |
US10930136B2 (en) | 2005-03-16 | 2021-02-23 | Icontrol Networks, Inc. | Premise management systems and methods |
US11615697B2 (en) | 2005-03-16 | 2023-03-28 | Icontrol Networks, Inc. | Premise management systems and methods |
US11113950B2 (en) | 2005-03-16 | 2021-09-07 | Icontrol Networks, Inc. | Gateway integrated with premises security system |
US10721087B2 (en) | 2005-03-16 | 2020-07-21 | Icontrol Networks, Inc. | Method for networked touchscreen with integrated interfaces |
US11418518B2 (en) | 2006-06-12 | 2022-08-16 | Icontrol Networks, Inc. | Activation of gateway device |
US10616244B2 (en) | 2006-06-12 | 2020-04-07 | Icontrol Networks, Inc. | Activation of gateway device |
US10785319B2 (en) | 2006-06-12 | 2020-09-22 | Icontrol Networks, Inc. | IP device discovery systems and methods |
US11412027B2 (en) | 2007-01-24 | 2022-08-09 | Icontrol Networks, Inc. | Methods and systems for data communication |
US11418572B2 (en) | 2007-01-24 | 2022-08-16 | Icontrol Networks, Inc. | Methods and systems for improved system performance |
US11706279B2 (en) | 2007-01-24 | 2023-07-18 | Icontrol Networks, Inc. | Methods and systems for data communication |
US11194320B2 (en) | 2007-02-28 | 2021-12-07 | Icontrol Networks, Inc. | Method and system for managing communication connectivity |
US10747216B2 (en) | 2007-02-28 | 2020-08-18 | Icontrol Networks, Inc. | Method and system for communicating with and controlling an alarm system from a remote server |
US10657794B1 (en) | 2007-02-28 | 2020-05-19 | Icontrol Networks, Inc. | Security, monitoring and automation controller access and use of legacy security control panel information |
US11809174B2 (en) | 2007-02-28 | 2023-11-07 | Icontrol Networks, Inc. | Method and system for managing communication connectivity |
US9544563B1 (en) | 2007-03-23 | 2017-01-10 | Proximex Corporation | Multi-video navigation system |
US9544496B1 (en) | 2007-03-23 | 2017-01-10 | Proximex Corporation | Multi-video navigation |
US20170085803A1 (en) * | 2007-03-23 | 2017-03-23 | Proximex Corporation | Multi-video navigation |
US10484611B2 (en) * | 2007-03-23 | 2019-11-19 | Sensormatic Electronics, LLC | Multi-video navigation |
US10326940B2 (en) | 2007-03-23 | 2019-06-18 | Proximex Corporation | Multi-video navigation system |
US11663902B2 (en) | 2007-04-23 | 2023-05-30 | Icontrol Networks, Inc. | Method and system for providing alternate network access |
US11132888B2 (en) | 2007-04-23 | 2021-09-28 | Icontrol Networks, Inc. | Method and system for providing alternate network access |
US10672254B2 (en) | 2007-04-23 | 2020-06-02 | Icontrol Networks, Inc. | Method and system for providing alternate network access |
US10523689B2 (en) | 2007-06-12 | 2019-12-31 | Icontrol Networks, Inc. | Communication protocols over internet protocol (IP) networks |
US11237714B2 (en) | 2007-06-12 | 2022-02-01 | Icontrol Networks, Inc. | Control system user interface |
US11423756B2 (en) | 2007-06-12 | 2022-08-23 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US11089122B2 (en) | 2007-06-12 | 2021-08-10 | Icontrol Networks, Inc. | Controlling data routing among networks |
US11894986B2 (en) | 2007-06-12 | 2024-02-06 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US11582065B2 (en) | 2007-06-12 | 2023-02-14 | Icontrol Networks, Inc. | Systems and methods for device communication |
US11601810B2 (en) | 2007-06-12 | 2023-03-07 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US11316753B2 (en) | 2007-06-12 | 2022-04-26 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US11611568B2 (en) | 2007-06-12 | 2023-03-21 | Icontrol Networks, Inc. | Communication protocols over internet protocol (IP) networks |
US10666523B2 (en) | 2007-06-12 | 2020-05-26 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US11625161B2 (en) | 2007-06-12 | 2023-04-11 | Icontrol Networks, Inc. | Control system user interface |
US11632308B2 (en) | 2007-06-12 | 2023-04-18 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US10616075B2 (en) | 2007-06-12 | 2020-04-07 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US11722896B2 (en) | 2007-06-12 | 2023-08-08 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US11212192B2 (en) | 2007-06-12 | 2021-12-28 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US11218878B2 (en) | 2007-06-12 | 2022-01-04 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US11646907B2 (en) | 2007-06-12 | 2023-05-09 | Icontrol Networks, Inc. | Communication protocols in integrated systems |
US11815969B2 (en) | 2007-08-10 | 2023-11-14 | Icontrol Networks, Inc. | Integrated security system with parallel processing architecture |
US11831462B2 (en) | 2007-08-24 | 2023-11-28 | Icontrol Networks, Inc. | Controlling data routing in premises management systems |
US11916928B2 (en) | 2008-01-24 | 2024-02-27 | Icontrol Networks, Inc. | Communication protocols over internet protocol (IP) networks |
US11816323B2 (en) | 2008-06-25 | 2023-11-14 | Icontrol Networks, Inc. | Automation system user interface |
US11190578B2 (en) | 2008-08-11 | 2021-11-30 | Icontrol Networks, Inc. | Integrated cloud system with lightweight gateway for premises automation |
US11758026B2 (en) | 2008-08-11 | 2023-09-12 | Icontrol Networks, Inc. | Virtual device systems and methods |
US11258625B2 (en) | 2008-08-11 | 2022-02-22 | Icontrol Networks, Inc. | Mobile premises automation platform |
US11616659B2 (en) | 2008-08-11 | 2023-03-28 | Icontrol Networks, Inc. | Integrated cloud system for premises automation |
US11368327B2 (en) | 2008-08-11 | 2022-06-21 | Icontrol Networks, Inc. | Integrated cloud system for premises automation |
US11792036B2 (en) | 2008-08-11 | 2023-10-17 | Icontrol Networks, Inc. | Mobile premises automation platform |
US11316958B2 (en) | 2008-08-11 | 2022-04-26 | Icontrol Networks, Inc. | Virtual device systems and methods |
US11641391B2 (en) | 2008-08-11 | 2023-05-02 | Icontrol Networks, Inc. | Integrated cloud system with lightweight gateway for premises automation |
US11729255B2 (en) | 2008-08-11 | 2023-08-15 | Icontrol Networks, Inc. | Integrated cloud system with lightweight gateway for premises automation |
US11711234B2 (en) | 2008-08-11 | 2023-07-25 | Icontrol Networks, Inc. | Integrated cloud system for premises automation |
US20100135157A1 (en) * | 2008-12-02 | 2010-06-03 | Kim Sang Wan | Method and apparatus for controlling traffic according to user |
US8149699B2 (en) * | 2008-12-02 | 2012-04-03 | Electronics And Telecommunications Research Institute | Method and apparatus for controlling traffic according to user |
US20100138842A1 (en) * | 2008-12-03 | 2010-06-03 | Soren Balko | Multithreading And Concurrency Control For A Rule-Based Transaction Engine |
US10002161B2 (en) * | 2008-12-03 | 2018-06-19 | SAP SE | Multithreading and concurrency control for a rule-based transaction engine |
US11553399B2 (en) | 2009-04-30 | 2023-01-10 | Icontrol Networks, Inc. | Custom content for premises management |
US11356926B2 (en) | 2009-04-30 | 2022-06-07 | Icontrol Networks, Inc. | Hardware configurable security, monitoring and automation controller having modular communication protocol interfaces |
US11129084B2 (en) | 2009-04-30 | 2021-09-21 | Icontrol Networks, Inc. | Notification of event subsequent to communication failure with security system |
US11223998B2 (en) | 2009-04-30 | 2022-01-11 | Icontrol Networks, Inc. | Security, monitoring and automation controller access and use of legacy security control panel information |
US11665617B2 (en) | 2009-04-30 | 2023-05-30 | Icontrol Networks, Inc. | Server-based notification of alarm event subsequent to communication failure with armed security system |
US11856502B2 (en) | 2009-04-30 | 2023-12-26 | Icontrol Networks, Inc. | Method, system and apparatus for automated inventory reporting of security, monitoring and automation hardware and software at customer premises |
US11284331B2 (en) | 2009-04-30 | 2022-03-22 | Icontrol Networks, Inc. | Server-based notification of alarm event subsequent to communication failure with armed security system |
US11778534B2 (en) | 2009-04-30 | 2023-10-03 | Icontrol Networks, Inc. | Hardware configurable security, monitoring and automation controller having modular communication protocol interfaces |
US10813034B2 (en) | 2009-04-30 | 2020-10-20 | Icontrol Networks, Inc. | Method, system and apparatus for management of applications for an SMA controller |
US11601865B2 (en) | 2009-04-30 | 2023-03-07 | Icontrol Networks, Inc. | Server-based notification of alarm event subsequent to communication failure with armed security system |
US10674428B2 (en) | 2009-04-30 | 2020-06-02 | Icontrol Networks, Inc. | Hardware configurable security, monitoring and automation controller having modular communication protocol interfaces |
US9521284B2 (en) | 2010-05-21 | 2016-12-13 | Hand Held Products, Inc. | Interactive user interface for capturing a document in an image signal |
US9451132B2 (en) | 2010-05-21 | 2016-09-20 | Hand Held Products, Inc. | System for capturing a document in an image signal |
US9319548B2 (en) | 2010-05-21 | 2016-04-19 | Hand Held Products, Inc. | Interactive user interface for capturing a document in an image signal |
US8600167B2 (en) | 2010-05-21 | 2013-12-03 | Hand Held Products, Inc. | System for capturing a document in an image signal |
US9047531B2 (en) | 2010-05-21 | 2015-06-02 | Hand Held Products, Inc. | Interactive user interface for capturing a document in an image signal |
US11398147B2 (en) | 2010-09-28 | 2022-07-26 | Icontrol Networks, Inc. | Method, system and apparatus for automated reporting of account and sensor zone information to a central station |
US11900790B2 (en) | 2010-09-28 | 2024-02-13 | Icontrol Networks, Inc. | Method, system and apparatus for automated reporting of account and sensor zone information to a central station |
US11750414B2 (en) | 2010-12-16 | 2023-09-05 | Icontrol Networks, Inc. | Bidirectional security sensor communication for a premises security system |
US10741057B2 (en) | 2010-12-17 | 2020-08-11 | Icontrol Networks, Inc. | Method and system for processing security event data |
US11341840B2 (en) | 2010-12-17 | 2022-05-24 | Icontrol Networks, Inc. | Method and system for processing security event data |
US11240059B2 (en) | 2010-12-20 | 2022-02-01 | Icontrol Networks, Inc. | Defining and implementing sensor triggered response rules |
US9020832B2 (en) | 2011-03-07 | 2015-04-28 | KBA2 Inc. | Systems and methods for analytic data gathering from image providers at an event or geographic location |
US8527340B2 (en) | 2011-03-07 | 2013-09-03 | Kba2, Inc. | Systems and methods for analytic data gathering from image providers at an event or geographic location |
US20120311562A1 (en) * | 2011-06-01 | 2012-12-06 | Yanlin Wang | Extendable event processing |
US9131129B2 (en) | 2011-06-17 | 2015-09-08 | Hand Held Products, Inc. | Terminal operative for storing frame of image data |
US8628016B2 (en) | 2011-06-17 | 2014-01-14 | Hand Held Products, Inc. | Terminal operative for storing frame of image data |
US9965769B1 (en) * | 2012-06-11 | 2018-05-08 | Retailmenot, Inc. | Devices, methods, and computer-readable media for redemption header for merchant offers |
US10664857B2 (en) | 2012-06-11 | 2020-05-26 | Retailmenot, Inc. | Determining offers for a geofenced geographic area |
US9558507B2 (en) | 2012-06-11 | 2017-01-31 | Retailmenot, Inc. | Reminding users of offers |
US9953335B2 (en) * | 2012-06-11 | 2018-04-24 | Retailmenot, Inc. | Devices, methods, and computer-readable media for redemption header for merchant offers |
US9639853B2 (en) | 2012-06-11 | 2017-05-02 | Retailmenot, Inc. | Devices, methods, and computer-readable media for redemption header for merchant offers |
US9881315B2 (en) | 2012-06-11 | 2018-01-30 | Retailmenot, Inc. | Systems, methods, and computer-readable media for a customizable redemption header for merchant offers across browser instances |
US10346867B2 (en) | 2012-06-11 | 2019-07-09 | Retailmenot, Inc. | Intents for offer-discovery systems |
US9442981B2 (en) | 2012-08-17 | 2016-09-13 | Splunk Inc. | Previewing parsed raw data using a graphical user interface |
US9208206B2 (en) * | 2012-08-17 | 2015-12-08 | Splunk Inc. | Selecting parsing rules based on data analysis |
US9740788B2 (en) | 2012-08-17 | 2017-08-22 | Splunk, Inc. | Interactive selection and display of a raw data parsing rule |
US10650069B2 (en) | 2012-08-17 | 2020-05-12 | Splunk Inc. | Previewing raw data parsing |
US20140337354A1 (en) * | 2012-08-17 | 2014-11-13 | Splunk Inc. | Indexing Preview |
US11886502B2 (en) | 2012-08-17 | 2024-01-30 | Splunk Inc. | Facilitating event creation via previews |
US9264474B2 (en) | 2013-05-07 | 2016-02-16 | KBA2 Inc. | System and method of portraying the shifting level of interest in an object or location |
US11296950B2 (en) | 2013-06-27 | 2022-04-05 | Icontrol Networks, Inc. | Control system user interface |
US11405463B2 (en) | 2014-03-03 | 2022-08-02 | Icontrol Networks, Inc. | Media content management |
US11146637B2 (en) | 2014-03-03 | 2021-10-12 | Icontrol Networks, Inc. | Media content management |
US20180278650A1 (en) * | 2014-09-14 | 2018-09-27 | Sophos Limited | Normalized indications of compromise |
US10841339B2 (en) * | 2014-09-14 | 2020-11-17 | Sophos Limited | Normalized indications of compromise |
WO2016060741A1 (en) * | 2014-10-15 | 2016-04-21 | Ayla Networks, Inc. | Flexible rules engine for managing connected consumer devices |
EP3207456A4 (en) * | 2014-10-15 | 2018-05-30 | Ayla Networks, Inc. | Flexible rules engine for managing connected consumer devices |
CN107003924A (en) * | 2014-10-15 | 2017-08-01 | Ayla Networks, Inc. | Flexible rules engine for managing connected consumer devices |
US9729383B2 (en) | 2014-10-15 | 2017-08-08 | Ayla Networks, Inc. | Flexible rules engine for managing connected consumer devices |
US9811739B2 (en) * | 2014-11-05 | 2017-11-07 | Vivotek Inc. | Surveillance system and surveillance method |
US20160125247A1 (en) * | 2014-11-05 | 2016-05-05 | Vivotek Inc. | Surveillance system and surveillance method |
US11252168B2 (en) | 2015-12-22 | 2022-02-15 | SAP SE | System and user context in enterprise threat detection |
US20170178026A1 (en) * | 2015-12-22 | 2017-06-22 | SAP SE | Log normalization in enterprise threat detection |
US10083359B2 (en) * | 2016-04-28 | 2018-09-25 | Motorola Solutions, Inc. | Method and device for incident situation prediction |
CN109286780A (en) * | 2017-07-19 | 2019-01-29 | 和硕联合科技股份有限公司 | Video monitoring system and video monitoring method |
US20190026564A1 (en) * | 2017-07-19 | 2019-01-24 | Pegatron Corporation | Video surveillance system and video surveillance method |
US10558863B2 (en) * | 2017-07-19 | 2020-02-11 | Pegatron Corporation | Video surveillance system and video surveillance method |
US20200134741A1 (en) * | 2018-02-20 | 2020-04-30 | Osram GmbH | Controlled Agricultural Systems and Methods of Managing Agricultural Systems |
RU2805125C1 (en) * | 2020-03-27 | 2023-10-11 | Hangzhou Hikvision Digital Technology Co., Ltd. | Method, system and device for determining action |
WO2021190284A1 (en) * | 2020-03-27 | 2021-09-30 | Hangzhou Hikvision Digital Technology Co., Ltd. | Method, system, and device for determining action |
WO2022081720A1 (en) * | 2020-10-16 | 2022-04-21 | Splunk Inc. | Rule-based data stream processing |
US11669551B2 (en) | 2020-10-16 | 2023-06-06 | Splunk Inc. | Rule-based data stream processing |
Also Published As
Publication number | Publication date |
---|---|
US20080292140A1 (en) | 2008-11-27 |
US8350908B2 (en) | 2013-01-08 |
WO2008147874A2 (en) | 2008-12-04 |
WO2008147874A3 (en) | 2009-02-26 |
WO2009032366A3 (en) | 2009-06-11 |
WO2008147915A3 (en) | 2009-01-22 |
WO2008147913A2 (en) | 2008-12-04 |
WO2008147913A3 (en) | 2009-01-22 |
US20120179835A1 (en) | 2012-07-12 |
WO2009032366A2 (en) | 2009-03-12 |
US20080294990A1 (en) | 2008-11-27 |
US8527655B2 (en) | 2013-09-03 |
US8356249B2 (en) | 2013-01-15 |
WO2008147915A2 (en) | 2008-12-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20080294588A1 (en) | Event capture, cross device event correlation, and responsive actions | |
US20230246905A1 (en) | Cooperative monitoring networks | |
US10587460B2 (en) | Systems and methods for correlating sensory events and legacy system events utilizing a correlation engine for security, safety, and business productivity | |
US9619984B2 (en) | Systems and methods for correlating data from IP sensor networks for security, safety, and business productivity applications | |
US7999847B2 (en) | Audio-video tip analysis, storage, and alerting system for safety, security, and business productivity | |
US20050132414A1 (en) | Networked video surveillance system | |
US20080303903A1 (en) | Networked video surveillance system | |
US7693589B2 (en) | Anomaly anti-pattern | |
US7710257B2 (en) | Pattern driven effectuator system | |
AU2010291859A1 (en) | Video camera system | |
US20080168531A1 (en) | Method, system and program product for alerting an information technology support organization of a security event | |
Norman | Integrated security systems design: A complete reference for building enterprise-wide digital security systems | |
US20230208869A1 (en) | Generative artificial intelligence method and system configured to provide outputs for company compliance | |
US7710258B2 (en) | Emergent information pattern driven sensor networks | |
US7756593B2 (en) | Anomaly anti-pattern | |
US11503075B1 (en) | Systems and methods for continuous compliance of nodes | |
US20150287306A1 (en) | Proactive Loss Prevention System | |
Adams et al. | How port security has to evolve to address the cyber-physical security threat: lessons from the SAURON project | |
Buford et al. | Insider threat detection using situation-aware MAS | |
EP1915743A1 (en) | Physical security management system | |
Nieminen et al. | Multi-sensor logical decision making in the single location surveillance point system | |
Wright et al. | The illusion of security | |
KR101564046B1 (en) | Apparatus, method and computer readable recording medium for controlling entry by a board to board communication | |
CN102483838A (en) | Security management using social networking | |
US20230044156A1 (en) | Artificial intelligence-based system and method for facilitating management of threats for an organization |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: VIDSYS, INC., VIRGINIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MORRIS, STEPHEN JEFFREY;CLARKSON, RICHARD CASEY;BOLTON, STEVEN ARNOLD;AND OTHERS;REEL/FRAME:021231/0104;SIGNING DATES FROM 20080529 TO 20080713 |
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |