WO2016040018A1 - System and method for hierarchical sensor processing - Google Patents

System and method for hierarchical sensor processing

Info

Publication number
WO2016040018A1
WO2016040018A1 (PCT/US2015/047641)
Authority
WO
WIPO (PCT)
Prior art keywords
processing level
sensor data
processing
sensor
processor
Prior art date
Application number
PCT/US2015/047641
Other languages
French (fr)
Inventor
Stephen Lloyd
James B. Lim
Original Assignee
Invensense Incorporated
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US14/480,364 external-priority patent/US20150321903A1/en
Application filed by Invensense Incorporated filed Critical Invensense Incorporated
Publication of WO2016040018A1 publication Critical patent/WO2016040018A1/en

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B81 MICROSTRUCTURAL TECHNOLOGY
    • B81C PROCESSES OR APPARATUS SPECIALLY ADAPTED FOR THE MANUFACTURE OR TREATMENT OF MICROSTRUCTURAL DEVICES OR SYSTEMS
    • B81C 1/00 Manufacture or treatment of devices or systems in or on a substrate
    • B81C 1/00015 Manufacture or treatment of devices or systems in or on a substrate for manufacturing microsystems
    • B81C 1/00222 Integrating an electronic processing unit with a micromechanical structure
    • B81C 1/0023 Packaging together an electronic processing unit die and a micromechanical structure die
    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01L SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L 2224/00 Indexing scheme for arrangements for connecting or disconnecting semiconductor or solid-state bodies and methods related thereto as covered by H01L 24/00
    • H01L 2224/01 Means for bonding being attached to, or being formed on, the surface to be connected, e.g. chip-to-package, die-attach, "first-level" interconnects; Manufacturing methods related thereto
    • H01L 2224/10 Bump connectors; Manufacturing methods related thereto
    • H01L 2224/15 Structure, shape, material or disposition of the bump connectors after the connecting process
    • H01L 2224/16 Structure, shape, material or disposition of the bump connectors after the connecting process of an individual bump connector
    • H01L 2224/161 Disposition
    • H01L 2224/16151 Disposition the bump connector connecting between a semiconductor or solid-state body and an item not being a semiconductor or solid-state body, e.g. chip-to-substrate, chip-to-passive
    • H01L 2224/16221 Disposition the bump connector connecting between a semiconductor or solid-state body and an item not being a semiconductor or solid-state body, e.g. chip-to-substrate, chip-to-passive, the body and the item being stacked
    • H01L 2224/16225 Disposition the bump connector connecting between a semiconductor or solid-state body and an item not being a semiconductor or solid-state body, e.g. chip-to-substrate, chip-to-passive, the body and the item being stacked, the item being non-metallic, e.g. insulating substrate with or without metallisation
    • H01L 2224/42 Wire connectors; Manufacturing methods related thereto
    • H01L 2224/47 Structure, shape, material or disposition of the wire connectors after the connecting process
    • H01L 2224/48 Structure, shape, material or disposition of the wire connectors after the connecting process of an individual wire connector
    • H01L 2224/4805 Shape
    • H01L 2224/4809 Loop shape
    • H01L 2224/48091 Arched
    • H01L 2224/481 Disposition
    • H01L 2224/48135 Connecting between different semiconductor or solid-state bodies, i.e. chip-to-chip
    • H01L 2224/48145 Connecting between different semiconductor or solid-state bodies, i.e. chip-to-chip, the bodies being stacked
    • H01L 2224/48151 Connecting between a semiconductor or solid-state body and an item not being a semiconductor or solid-state body, e.g. chip-to-substrate, chip-to-passive
    • H01L 2224/48221 Connecting between a semiconductor or solid-state body and an item not being a semiconductor or solid-state body, e.g. chip-to-substrate, chip-to-passive, the body and the item being stacked
    • H01L 2224/48225 Connecting between a semiconductor or solid-state body and an item not being a semiconductor or solid-state body, e.g. chip-to-substrate, chip-to-passive, the body and the item being stacked, the item being non-metallic, e.g. insulating substrate with or without metallisation
    • H01L 2224/48227 Connecting between a semiconductor or solid-state body and an item not being a semiconductor or solid-state body, e.g. chip-to-substrate, chip-to-passive, the body and the item being stacked, the item being non-metallic, e.g. insulating substrate with or without metallisation, connecting the wire to a bond pad of the item
    • H01L 2924/00 Indexing scheme for arrangements or methods for connecting or disconnecting semiconductor or solid-state bodies as covered by H01L 24/00
    • H01L 2924/15 Details of package parts other than the semiconductor or other solid state devices to be connected
    • H01L 2924/151 Die mounting substrate
    • H01L 2924/1517 Multilayer substrate
    • H01L 2924/15182 Fan-in arrangement of the internal vias
    • H01L 2924/15184 Fan-in arrangement of the internal vias in different layers of the multilayer substrate

Definitions

  • The present invention relates to integrated systems, such as those arranged to include microelectromechanical systems (MEMS), that provide for signal processing, and more particularly to systems that process signals from sensors and output information from the processed signals to other devices, applications and arrangements.
  • The present invention addresses such a need by overcoming the prior limitations in the field.
  • a method for processing sensor data hierarchically may include receiving sensor data input at a first processing level, performing a first operation on the received sensor data input at the first processing level, outputting processed sensor data from the first processing level to a second processing level, performing a second operation on the processed sensor data at the second processing level and outputting a result from the second operation to a third processing level.
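The three-level flow described in the method above can be pictured as a simple pipeline. The sketch below is illustrative only: the smoothing, averaging and classification functions are hypothetical stand-ins for the first, second and third operations, not code from the patent.

```python
def level1(raw_samples):
    """First processing level: smooth raw sensor samples (hypothetical first operation)."""
    filtered, prev = [], raw_samples[0]
    for s in raw_samples:
        prev = 0.8 * prev + 0.2 * s  # simple IIR low-pass filter
        filtered.append(prev)
    return filtered

def level2(filtered):
    """Second processing level: aggregate the processed stream into one feature."""
    return sum(filtered) / len(filtered)

def level3(feature):
    """Third processing level: consume the level-2 result, e.g. classify it."""
    return "motion" if abs(feature) > 1.0 else "still"

raw = [0.1, 2.5, 2.7, 2.6, 0.2]      # sensor data input to the first level
event = level3(level2(level1(raw)))  # level 1 -> level 2 -> level 3
```

Each stage receives only the output of the stage below it, which is the essence of the hierarchy: raw data never has to travel past the first level.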
  • Receiving sensor data input at the first processing level may include receiving sensor data from at least one embedded sensor that is integrated with a processor, from at least one embedded sensor that is integrated with memory, receiving raw sensor data from at least one external sensor and/or receiving processed sensor data from another hierarchical processing level.
  • a plurality of independent processors may be provided at the first processing level, such that each processor may perform an operation on received sensor data.
  • sensor data input may be received at the second processing level from at least one embedded sensor that is integrated with a processor.
  • at least one of the first operation and the second operation may include aggregating sensor data.
  • at least one of the first operation and the second operation comprises processing sensor data to increase an amount of information per bit and the amount of information per bit may be increased at both the first and the second processing levels.
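Increasing the amount of information per bit typically means replacing many raw samples with a compact summary that downstream levels can consume directly. A minimal, hypothetical sketch of such an operation:

```python
def summarize(samples):
    """Replace N raw samples with a two-number summary (mean, peak magnitude),
    so each bit passed upward carries more information than a raw-sample bit."""
    mean = sum(samples) / len(samples)
    peak = max(abs(s) for s in samples)
    return mean, peak

raw = [0.0, 0.1, 3.2, 0.1, -0.2, 0.0]  # six raw samples in
mean, peak = summarize(raw)            # two numbers out
```

Applying such a reduction at both the first and second levels compounds the effect, which is what the paragraph above describes.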
  • sensor data input at the second processing level may be received from an external sensor.
  • a plurality of independent processors may be provided at the second processing level, such that each processor may perform an operation on processed sensor data output by another hierarchical processing level.
  • power management may be independently implemented at the first and second processing levels. Further, a power mode of one processing level may be changed based, at least in part, on a result of an operation at another processing level. Alternatively or in addition, one processing level may be transitioned between a power save mode and an active mode based, at least in part, on an operation performed at another processing level. Still further, an action at one processing level may be triggered based, at least in part, on an operation performed at another processing level.
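The cross-level power behavior described above can be sketched as follows. The level names, modes and trigger condition are illustrative assumptions, not taken from the patent.

```python
class ProcessingLevel:
    """Minimal model of a processing level with independent power management."""
    def __init__(self, name):
        self.name = name
        self.mode = "power_save"

    def wake(self):
        self.mode = "active"

    def sleep(self):
        self.mode = "power_save"

def on_level_result(result, other_level):
    """A result of an operation at one level changes another level's power mode."""
    if result == "motion_detected":
        other_level.wake()   # trigger an action at the other level
    else:
        other_level.sleep()

gesture_level = ProcessingLevel("gesture")         # hypothetical second level
on_level_result("motion_detected", gesture_level)  # first-level result wakes it
```

Because each level manages its own mode, higher levels can stay in power-save mode until a lower level produces a result that warrants waking them.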
  • the sensor data input received at one processing level comprises data from a set of sensors.
  • This disclosure may also include a system for processing sensor data, having at least one processor of a first processing level configured to receive sensor data input and perform a first operation on the received sensor data, at least one processor of a second processing level configured to receive processed sensor data from the first processing level and perform a second operation on the processed sensor data and at least one processor of a third processing level configured to receive a result of the second operation from the second processing level.
  • the system may include at least one embedded sensor that is integrated with a processor of the first processing level providing the sensor data input.
  • the system may also include at least one embedded sensor that is integrated with memory providing the sensor data input.
  • the system may include at least one external sensor providing the sensor data input.
  • the system may include another hierarchical level providing the sensor data input.
  • the system may have a plurality of independent processors of the first processing level, such that each processor is configured to perform an operation on received sensor data.
  • the system may have at least one embedded sensor that is integrated with a processor of the second processing level and is configured to output sensor data to the second processing level.
  • At least one of the first operation and the second operation comprises aggregating sensor data. Further, at least one of the first operation and the second operation may include processing sensor data to increase an amount of information per bit, and both the first and the second processing levels may be configured to increase the amount of information per bit. Further, the system may include at least one external sensor that is configured to output sensor data to the second processing level. In addition, the system may have a plurality of independent processors at the second processing level, such that each processor may perform an operation on processed sensor data output by another hierarchical processing level.
  • the system may include a power management block configured to independently control the first and second processing levels.
  • the power management block may change a power mode of one processing level based, at least in part, on a result of an operation at another processing level.
  • the power management block may also transition one processing level between a power save mode and an active mode based, at least in part, on an operation performed at another processing level.
  • one processing level may perform an action based, at least in part, on an operation performed at another processing level.
  • Additional embodiments of the present disclosure provide for a device and system having a plurality of sensors and a sensor hub coupled to the plurality of sensors for receiving their outputs, which may be implemented in computer-programmable software stored on computer-readable media.
  • FIG. 1 is an exemplary system diagram for the device and system herein having one or more embedded sensors and a sensor hub on a single chip, in accordance with one or more embodiments of the present invention.
  • FIG. 2A is an exemplary integrated sensor system (ISS) of the present invention having one or more embedded sensors in one or more MEMS chips and one or more CMOS chips with electronic circuits, in a single chip, in accordance with one or more embodiments of the present invention.
  • FIG. 2B is an exemplary integrated sensor system (ISS) of the present invention having one or more MEMS chips and one or more CMOS chips vertically stacked and bonded on a substrate, in accordance with one or more embodiments of the present invention.
  • FIG. 2C is an exemplary integrated sensor system (ISS) of the present invention having one or more MEMS chips and one or more CMOS chips vertically stacked and bonded on a substrate, in accordance with one or more embodiments of the present invention.
  • FIG. 3 depicts a system diagram of the ISS in which the sensor hub comprises one or more analog to digital convertors, one or more processors, memory, a power management block and a controller block, in accordance with one or more embodiments of the present invention.
  • FIG. 4 schematically depicts a system architecture of sensor hubs for the hierarchical processing of sensor data, in accordance with one or more embodiments of the present invention.
  • FIG. 5 schematically depicts a system having processing levels implemented in separate devices, in accordance with one or more embodiments of the present invention.
  • FIG. 6 schematically depicts a flow chart showing a routine for the hierarchical processing of sensor data, in accordance with one or more embodiments of the present invention.
  • the present invention relates to integrated systems arranged to include microelectromechanical systems (MEMS) that provide for signal processing and more particularly for those systems that provide for the processing of signals from sensors and also provide for the outputting of information from the processed signals to other devices, applications and arrangements.
  • The application relates to integrated sensor systems (ISSs) comprising one or more embedded sensors and a sensor hub arranged on a single chip, which can also receive inputs from external sensor sources and facilitate efficient communication among the sensors for improved high-level features, such as interpreting gestures or actions according to the context.
  • the application also relates to systems and methods for the hierarchical processing of sensor data, including sensor data from an embedded sensor in an ISS or from other sensor configurations.
  • MEMS refers to a class of devices fabricated using semiconductor-like processes and exhibiting mechanical characteristics such as the ability to move or deform. MEMS often, but not always, interact with electrical signals. Silicon wafers containing MEMS structures are referred to as MEMS wafers.
  • MEMS device may refer to a semiconductor device implemented as a micro-electro-mechanical system.
  • a MEMS device includes mechanical elements and optionally includes electronics for sensing.
  • MEMS devices include, but are not limited to, gyroscopes, accelerometers, magnetometers, and pressure sensors.
  • MEMS features refer to elements formed by a MEMS fabrication process, such as a bump stop, damping hole, via, port, plate, proof mass, standoff, spring and seal ring.
  • MEMS structure may refer to any feature that may be part of a larger MEMS device.
  • MEMS features comprising movable elements are MEMS structures.
  • IC substrate may refer to a silicon substrate with electrical circuits, typically CMOS circuits.
  • a chip includes at least one substrate typically formed from a semiconductor material.
  • a single chip may be formed from multiple substrates, where the substrates are mechanically bonded to preserve the functionality.
  • Multiple chips include at least two substrates, wherein the substrates are electrically connected but do not require mechanical bonding.
  • a package provides electrical connection between the bond pads on the chip to a metal lead that can be soldered to a PCB.
  • a package typically comprises a substrate and a cover.
  • “Raw data” or “sensor data” refers to measurement outputs from the sensors which are not yet processed.
  • “Motion data” refers to processed sensor data. Processing may include applying a sensor fusion algorithm or applying any other algorithm, such as determining context, gestures, orientation, or a confidence value. In the case of the sensor fusion algorithm, data from one or more sensors are combined to provide an orientation of the device. Processed data, for example, may include motion data plus audio data plus vision data (video, still frame) plus touch/temperature data plus smell/taste data.
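As a concrete illustration of a sensor fusion algorithm producing an orientation, the sketch below fuses a gyroscope rate with an accelerometer tilt estimate using a complementary filter. This is a generic textbook technique shown for illustration, not the specific algorithm of the patent, and the sample values are invented.

```python
import math

def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Blend the integrated gyroscope rate (stable short-term) with the
    accelerometer's gravity-based tilt estimate (stable long-term)."""
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle

angle = 0.0  # orientation estimate, degrees
# five samples: gyro rate in deg/s, accelerometer x/z components in g
for gyro_rate, (ax, az) in [(10.0, (0.17, 0.98))] * 5:
    accel_angle = math.degrees(math.atan2(ax, az))  # tilt from gravity
    angle = complementary_filter(angle, gyro_rate, accel_angle, dt=0.01)
```

The combined estimate is the "motion data" in the sense above: neither raw gyroscope nor raw accelerometer output, but a processed orientation derived from both.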
  • ISSs as used herein provide sensor subsystems for a user application which combine multiple sensing types and capabilities (position, force, pressure, discrete switching, acceleration, angular rate, level, etc.), where that application may be a biological, chemical, electronic, medical, scientific and/or other sensing application.
  • ISSs as used herein also are intended to provide improved sizing and physical structures which are oriented to become smaller with improved technological gains.
  • ISSs also have suitable biocompatibility, corrosion resistance, and electronic integration for applications in which they may be deployed.
  • the first substrate is attached to the second substrate through wafer bonding, as described in commonly owned U.S. Patent No. 7,104,129 (incorporated herein by reference) that simultaneously provides electrical connections and hermetically seals the MEMS devices.
  • This fabrication technique advantageously enables technology that allows for the design and manufacture of high performance, multi-axis, inertial sensors in a very small and economical package.
  • Integration at the wafer-level minimizes parasitic capacitances, allowing for an improved signal-to-noise ratio relative to a discrete solution.
  • Such integration at the wafer-level also enables the incorporation of a rich feature set which minimizes the need for external amplification.
  • FIG. 1 is an exemplary system diagram 100 for the device and system 110 herein having one or more embedded sensors 120 and a sensor hub 130 on a single chip 100, in accordance with one or more embodiments of the present invention.
  • the ISS 110 is capable of communicating with external sensors 140 and also capable of outputting information, such as processed sensor data, to another device 150.
  • the sensor hub 130 receives sensor data from sensors 120.
  • The sensors 120 may include sensing devices and electronic circuits for converting analog signals to digital signals, and may be capable of determining sensed activities and information. These activities, for example, could include but are not limited to sleeping, waking up, walking, running, biking, participating in a sport, walking on stairs, driving, flying, training, exercising, cooking, watching television, reading, working at a computer, and eating.
  • sensors could be utilized for determining sensed locations.
  • these locations include but are not limited to a home, a workplace, a moving vehicle, indoor, outdoor, a meeting room, a store, a mall, a kitchen, a living room, and bedroom.
  • signals from a global positioning system (GPS) or other wireless system that generates location data could be utilized.
  • the sensors could send data to a GPS or other wireless system that generates location data to aid in low power location and navigation.
  • Sensors may include those devices which are capable of gathering data and/or information from measurements made by an accelerometer, gyroscope, compass, or by pressure, microphone, humidity, temperature, gas, chemical, ambient light, proximity, touch, and tactile sensing, for example; however, the present invention is not so limited. In one or more embodiments, sensors of the present invention are embedded sensors on the chip and/or sensors external to the ISS for data sensed external to the chip. From FIG. 1, the ISS 110 processes signals from sensors 120, 140 and outputs 150 to any other output device or to another device for further processing.
  • The output device is one or more of an application processor, memory, an audio output device, a haptic sensor and an LED.
  • the sensors are a MEMS sensor or a solid state sensor, though the sensors of the device and system may be any type of sensor.
  • the present invention may use data sensed from sensors including but not limited to a 3-axis accelerometer, 3-axis gyroscope, 3-axis magnetometer, pressure sensor, microphone, chemical sensor, gas sensor, humidity sensor, image sensor, ambient light, proximity, touch, and audio sensors, etc.
  • a gyroscope of the present invention includes the gyroscope disclosed and described in commonly-owned U.S. Patent No. 6,892,575, entitled "X-Y Axis Dual-Mass Tuning Fork Gyroscope with Vertically Integrated Electronics and Wafer-Scale Hermetic Packaging", which is incorporated herein by reference.
  • The gyroscope of the present invention is a gyroscope disclosed and described in the commonly-owned U.S. Patent Application No. 13/235,296, entitled “Micromachined Gyroscope Including a Guided Mass System”, also incorporated herein by reference.
  • The pressure sensor of the present invention is a pressure sensor as disclosed and described in the commonly-owned U.S. Patent Application No. 13/536,798, entitled “Hermetically Sealed MEMS Device with a Portion Exposed to the Environment with Vertically Integrated Electronics,” incorporated herein by reference.
  • A further embodiment of the present invention, in which the sensors are formed on a MEMS substrate, the electronic circuits are formed on a CMOS substrate, and the CMOS and MEMS substrates are vertically stacked and attached, is disclosed and described in commonly-owned U.S. Patent No. 8,250,921, entitled “Integrated Motion Processing Unit (MPU) With MEMS Inertial Sensing And Embedded Digital Electronics”.
  • FIG. 2A is an exemplary integrated sensor system (ISS) of the present invention having one or more embedded sensors in one or more MEMS chip 214 and one or more CMOS chip 212 with electronic circuits, attached to a substrate 206 to form a single chip 200, in accordance with one or more embodiments of the present invention.
  • the electronic circuits may include circuitry for sensing signals from sensors, processing the sensed signals and converting to digital signals.
  • FIG. 2A also shows an ISS of the present invention having a first arrangement of a MEMS chip 214 stacked vertically with a CMOS chip 212, and a second arrangement of a chip 202 vertically stacked with a chip 204, where the first and second arrangements are side-by-side on a substrate 206.
  • Chip 202 and chip 204 can be any combination of CMOS and MEMS. In another embodiment, chip 202 may not be present. In yet another embodiment, multiple chips such as 202 or 204 may be stacked. In some embodiments, the CMOS chip may also include memory.
  • FIG. 2B is an exemplary integrated sensor system (ISS) 300 of the present invention having one or more MEMS chip 302 and one or more CMOS chip 304 vertically stacked and bonded 303 on a substrate 306, in accordance with one or more embodiments of the present invention.
  • the combined MEMS and CMOS chips are bonded or connected by solder balls to block 305 and then bonded to the substrate 306.
  • block 305 could be any of or any combination of electronics, sensors or solid state devices such as batteries.
  • FIG. 2C is an exemplary integrated sensor system (ISS) 350 of the present invention in which one MEMS chip 302 and a plurality of CMOS chips 304A-304C are vertically stacked, and CMOS chip 304A is wire bonded to CMOS chip 304B, which is wire bonded to CMOS chip 304C.
  • the CMOS chip 304C in turn is wire bonded to a substrate 306, in accordance with one or more embodiments of the present invention.
  • the CMOS chips 304A, 304B and 304C could contain any of or any combination of electronic circuits.
  • the present invention provides for an ISS implemented in a single chip that can be mounted onto a surface of a printed circuit board (PCB).
  • the ISS of the present invention comprises one or more MEMS chip having one or more sensors attached to one or more CMOS chips with electronic circuitry.
  • one or more MEMS chips and one or more CMOS chips are vertically stacked and bonded.
  • an ISS of the present invention provides for having more than one MEMS and more than one CMOS chips arranged and placed side-by-side.
  • FIG. 3 depicts a system diagram 400 of the ISS 405 in which the sensor hub 450 comprises one or more analog to digital convertors 451, one or more processors (455-457), memory 452, a power management block 453 and a controller block 454, in accordance with one or more embodiments of the present invention.
  • the sensor hub 450 comprises one or more analog to digital convertors, one or more processors, memory, one or more power management blocks and one or more controller blocks.
  • The one or more processors 455-457 include, but are not limited to, any one of, or any combination of, an audio processor, an image processor, a motion processor, a touch processor, a location processor, a wireless processor, a radio processor, a graphics processor, a power management processor, an environmental processor, an application processor (AP), and a microcontroller unit (MCU).
  • Any of the one or more processors 455-457 or external sensors 470 can provide one or more interrupts to an external device, any of the embedded or external sensors, or any processor or the like based upon the sensor inputs.
  • The interrupt signal can perform any of, or any combination of: waking up a processor and/or sensor from a sleep state, initiating a transaction between memory and a sensor, initiating a transaction between memory and a processor, and initiating a transfer of data between memory and an external device.
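One way to picture the interrupt behavior above is a dispatch table mapping an interrupt cause to a registered action. The cause names and actions below are hypothetical, chosen only to mirror the list in the text.

```python
handlers = {}

def on_interrupt(cause):
    """Decorator registering a handler for a given interrupt cause."""
    def register(fn):
        handlers[cause] = fn
        return fn
    return register

log = []

@on_interrupt("wake")
def wake_from_sleep():
    log.append("processor woken from sleep state")

@on_interrupt("mem_to_sensor")
def memory_sensor_transaction():
    log.append("memory-to-sensor transaction initiated")

def raise_interrupt(cause):
    """Dispatch an interrupt to the action registered for its cause."""
    handlers[cause]()

raise_interrupt("wake")
raise_interrupt("mem_to_sensor")
```

In hardware the dispatch would be an interrupt controller rather than a dictionary, but the mapping from cause to action is the same idea.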
  • the sensor hub may include in some embodiments a real-time clock (RTC), a system clock oscillator or any other type of clock circuitry.
  • Resonators for the clocks can be implemented with MEMS structures. In so doing, external crystal resonators are not required, thereby saving cost, reducing power requirements and reducing the overall size of the device.
  • Embodiments of the sensor hub described herein can take the form of an entirely hardware implementation, an entirely software implementation, or an implementation containing both hardware and software elements. Embodiments may be implemented in software, which includes, but is not limited to, application software, firmware, resident software, microcode, etc. The steps described herein may be implemented using any suitable controller or processor and software application, which may be stored on any suitable storage location or computer-readable medium. The software application provides instructions that enable the processor to perform the functions described herein.
  • embodiments may take the form of a computer program product accessible from a computer-usable or computer-readable medium providing program code for use by or in connection with a computer or any instruction execution system.
  • a computer-usable or computer-readable medium can be any apparatus that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
  • the medium may be an electronic, magnetic, optical, electromagnetic, infrared, semiconductor system (or apparatus or device), or a propagation medium.
  • Examples of a computer-readable medium include a semiconductor or solid state memory, magnetic tape, a removable computer diskette, a random access memory (RAM), a read-only memory (ROM), a rigid magnetic disk, and an optical disk.
  • Current examples of optical disks include DVD, compact disk-read-only memory (CD-ROM), and compact disk-read/write (CD-R/W).
  • the ISS 405 receives inputs from one or more sensor sets (410, 420, 430).
  • a sensor set as used herein may include a single sensor or an arrangement of a plurality of sensors; the sensors in a set are not required to be of the same or similar type or utility, nor are they required to differ in type or utility.
  • a sensor set, or grouping, may include, or be determined in relation to, one or more of: the type of sensors, the type of application intended, and the type of application the sensor is to be connected to or in communication with. It will be appreciated by those skilled in the art that the present invention is not constrained or limited to a particular arrangement of sensors in a specific manner to constitute a grouping herein.
  • sensor set 1 (410) includes a 3-axis accelerometer, a 3-axis gyroscope, and a 3-axis magnetometer.
  • Sensor set 2 (420) includes certain sensors exposed to the environment such as a pressure sensor, a microphone, a chemical sensor, a gas sensor, a humidity sensor, etc.
  • Sensor set 3 (430) includes certain sensors being one or more of ambient light, proximity, touch, and audio-based sensors.
  • each of the sets of sensors is connected to a dedicated processor, where the connected processor is a general purpose processor.
  • each of the sets of sensors is connected to a dedicated processor, where the connected processor is a specialized processor, such as that required, by example, for an audio processor to process audio input.
  • each of the sets of sensors is arranged in relation to the processor to which it connects.
  • each of the processors of the present invention can execute various sensor fusion algorithms in which the sensor fusion algorithms are algorithms that combine various sensor inputs to generate one or more of the orientation of the device or combined sensors data that may then be used for further processing or any other actions as appropriate such as determining orientation of the user.
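One widely used sensor fusion approach of the kind described above is a complementary filter, which blends the gyroscope's integrated rate (accurate over short intervals) with the accelerometer's tilt estimate (stable over long intervals) into a single orientation angle. The sketch below is illustrative only; the function name, the use of degrees, and the blend factor `alpha` are assumptions and not part of this disclosure:

```python
def fuse_tilt(angle_prev, gyro_rate, accel_angle, dt, alpha=0.98):
    """Complementary-filter tilt estimate (degrees).

    angle_prev:  previous fused angle estimate
    gyro_rate:   gyroscope angular rate in degrees/second
    accel_angle: tilt angle derived from the accelerometer
    dt:          sample period in seconds
    alpha:       weight given to the gyroscope path (0..1)
    """
    # Trust the short-term gyroscope integration, but pull the estimate
    # toward the drift-free accelerometer reading over time.
    return alpha * (angle_prev + gyro_rate * dt) + (1.0 - alpha) * accel_angle
```

When both sources agree the output is unchanged; when the gyroscope drifts, the accelerometer term slowly corrects the estimate.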
  • the sensor hub 450 provides for facilitating efficient communication among the sensors for improved high-level features.
  • the sensor hub is capable of recognizing gestures and of triggering sensors, or processors, to be turned off or on.
  • the sensor hub is capable of performing intelligent sensor fusion in one or more aspects.
  • the present invention is capable of combining data from light, proximity and motion sensors to trigger the sending of data from a microphone to an audio processor (AP).
  • the sensor hub is capable of processing sensor inputs and output signals that actuate haptic sensors (i.e., tactile feedback technology which takes advantage of the sense of touch by applying forces, vibrations, or motions to the user).
  • the output signals can be one or more of audio, vibration or light (LED).
  • the power management block 453 performs power management operations across all sensor sets, including external sensors 470.
  • the power management block is capable of turning off or turning on a sensor based on other sensor inputs or input from the application processor (AP).
  • the power management block is further capable of putting the device or processors in a low power mode based on the one or more sensors.
  • the power management block is further capable of applying a low power mode to one or more sensors based on one or more other sensors.
  • the power management block is capable of turning on the microphone.
  • when the ambient light sensor senses low light in an environment and the sensed low-light situation is combined with accelerometer measurements, the device may be set or otherwise configured for sleep mode.
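As a minimal illustration of a power management rule of this kind, the hypothetical function below combines an ambient light reading with an accelerometer magnitude to decide whether the device may be configured for sleep mode. The function name, units, and thresholds are assumptions made for illustration:

```python
def should_sleep(ambient_lux, accel_magnitude_g,
                 lux_threshold=5.0, still_band=0.05):
    """Decide whether to enter a low-power mode.

    A dark environment combined with an accelerometer magnitude close
    to 1 g (i.e., only gravity, no motion) suggests the device is idle,
    e.g., face-down on a table or in a pocket at night.
    """
    dark = ambient_lux < lux_threshold
    still = abs(accel_magnitude_g - 1.0) < still_band
    return dark and still
```

Either condition alone is insufficient: a dark room with a moving device, or a bright room with a still device, keeps the device awake.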
  • the memories 452a-452d can store raw sensor data from sensors, including those of the external sensors.
  • the motion data or processed data is also stored in the memory.
  • memories 452a-452d as used herein can include single port or multiport SRAM, DRAM or FIFO, for instance.
  • a first memory can reside in ISS 405 outside sensor hub 450 in addition to memories 452a-452d to store any of sensor data, motion data and instructions.
  • a second memory can reside external to ISS 405 to store sensor data, motion data and instructions.
  • the controller block 454 of FIG. 3 includes control logic for the sensor hub 450.
  • the controller block also includes a bus master.
  • the bus master, not pictured, manages the data storage from sensors and also provides for the storing of data from the processors.
  • the sensor hub of the present invention is capable of receiving measurements from more than one sensor to determine the "context" of the user's actions. In this embodiment, the sensor hub is then capable of interpreting gestures or actions according to the context.
  • Context can be established in a variety of ways. In one example, location can establish the context. The context could also be established based on the way the system and the devices to be controlled are connected (GPS, local Wi-Fi, etc.).
  • a state of the device to be controlled establishes the context. For example, if a device that includes the ISS has a browser page open, this could establish a context to enable "air-mouse" type functionality on a wearable device. This state could be as simple as the device being turned on.
  • a system and method in accordance with the present invention can be implemented in varying contexts and situations.
  • a location defines the context for the operation of the ISS of the present invention.
  • the implementation could be based on inertial sensors providing location information, or on the way in which the system is connected (such as with localized Wi-Fi or via another connection method) where all the devices to be controlled are connected similarly, irrespective of the Wi-Fi source, etc.
  • an implementation could be based on the state of the device to be controlled as defining the context. For example, in an implementation involving a television having a browser page open, a context to enable "air-mouse" type functionality on the wearable device could be established. In such an implementation, the state could simply be the device being turned ON or OFF.
  • an implementation could be based on time as defining the context. For example, in an implementation involving a determination as to whether it is day or night to enable a light on/off functionality.
  • an implementation could be based on proximity as defining the context. For example, in an implementation an ISS providing information about proximity to a device could be used as context.
  • an implementation could be based on a picture of the device to be controlled as defining the context.
  • a picture of the device could be used as a context, such as in the situation where the wearable device takes the form of computer-based glasses, for instance.
  • an implementation could be based on a device being turned ON or OFF as defining the context.
  • a device turning ON one sensor could further be associated with a proximity to the device (another sensor).
  • an implementation could be based on a device being activated by another independent act as defining the context. For example, where a phone rings as triggered by another party calling in to the line, such could further be associated with lowering volumes or turning off those associated remote devices that are active at the time of the phone ringing.
  • an implementation could be based on being able to access a device's actuation as defining the context. For example, in an implementation involving a garage door, even in the event where a car within the garage is being stolen, the thief is unable to open the garage door absent control over a device that includes an ISS which enables the door to open or close.
  • in other aspects, an implementation could be based on a user's situation as defining the context.
  • the sensors of the ISS could establish Turn-off/Turn-on features on one or more remote devices (e.g., auto alarm the house, control thermostat, CO-Alarm, smoke detector, etc.).
  • an implementation could be based on a context of a social gathering at a predetermined location.
  • a social event having a series of predetermined timed events where each event has multiple remote devices engaged to be activated to perform a function (e.g., streamers release, music, lights, microphone, etc.)
  • each remote device is configured to be active only during pre-set periods and each device is also configured to recognize and receive specific commands from gestures or movements from a device that includes the ISS.
  • a user can control certain of the remote devices independently from one another, and others dependently with one another, without manually resetting or engaging others at additional cost to operate the event.
  • the primary sensors may be motion sensors that let the user know a person has entered the room; that information may engage a video camera and a microphone at the remote location, allowing the user to see and communicate with whoever has entered.
  • additional sensors may be used to provide information about which room is being utilized for the meeting as well as the identity of all the attendees to provide more context.
  • the above description is by way of example only, and one of ordinary skill in the art recognizes that any sensor, or any combination of sensors, can provide context information; generally, the more different types of sensors that are available, the better the context for a user.
  • the sensors in the ISS along with the algorithm in the memory can detect basic units such as a velocity, acceleration, gravity, elevation, environmental motion/vibrations, background noise, audio signature, detecting keywords, images, video, motion gestures, image gesture, ambient light, body temperature, ambient temperature, humidity, rotation, orientation, heading, ambient pressure, air quality, and flat tire detection.
  • the air quality can be the amount of oxygen (O2) or carbon dioxide (CO2), or a particle count.
  • this present invention relates to system architectures configured to process sensor data hierarchically. Two or more processing levels may be provided so that sensor data processed at a lower level may be output to an upper level for further processing or other operation involving the processed data from the lower level. At least one processor is provided at each processing level and, as desired, may be implemented as an ISS comprising one or more embedded sensors, as a processor receiving inputs from external sensor sources or as any other processor and sensor configuration. As will be described below, such an architecture may facilitate efficient communication among the sensors for improved high-level features, such as interpreting gestures or actions according to the context.
  • FIG. 4 schematically depicts an exemplary diagram of a system 500 configured to process sensor data hierarchically.
  • system 500 may involve at least two hierarchical processing levels, such as first processing level 502 and second processing level 504.
  • first processing level 502 features sensor hub 506, which has processor 508, memory 510 and at least one embedded sensor 512.
  • sensor hub 506 has processor 508, memory 510 and at least one embedded sensor 512.
  • processors at processing level 502 may receive sensor data input from any suitable source, such as an embedded internal sensor 512 or an external sensor 520.
  • Each processor 508 and 516 may independently perform one or more operations on the received sensor data. As will be appreciated, a variety of operations may be performed on the sensor data, including aggregation, sensor fusion, gesture recognition and other suitable algorithms for processing sensor data.
  • while first processing level 502 is shown as receiving raw sensor data, such as from internal sensor 512 or external sensor 520, one or more processors at first processing level 502 may instead receive processed sensor data from a lower hierarchical level.
  • Sensor hubs 506 and 514 may be configured to output processed sensor data to second processing level 504 after performing one or more operations with processors 508 and 516, respectively.
  • second processing level 504 includes sensor hub 522 having processor 524 and memory 526 to receive the processed sensor data output from first processing level 502.
  • Processor 524 may perform one or more operations on the output processed sensor data, such as those described above.
  • Raw sensor data may also be received for processing at second processing level 504.
  • sensor hub 522 includes embedded sensor 528, which may output data to processor 524.
  • Raw sensor data may also be provided to second processing level 504 from sensor hub 530 having memory 532 to aggregate data from embedded sensor 534 or other externally implemented sensor.
  • second processing level 504 may also receive processed sensor data from a different hierarchical level.
  • Second processing level 504 may be configured to output processed sensor data from processor 524 to third processing level 536, which in this embodiment includes application processor 538.
  • third processing level 536 may represent the top of the hierarchy, but additional processing levels may be provided as desired.
  • Processed sensor data output by second processing level 504 at least includes the results of processor 524 performing one or more operations on data received from first processing level 502, but may also include the results of processor 524 performing one or more operations on raw sensor data, such as received from sensor 528 or sensor hub 530, or on processed sensor data received from a different hierarchical level.
  • first processing level 502 may receive raw motion sensor data, such as from embedded sensor 512 that may include a gyroscope and/or an accelerometer.
  • Processor 508 may be configured to recognize a pattern of raw motion sensor data corresponding to a specific context as described above, such as one step of a user's stride in a pedometer application. Consequently, processor 508 may output information to second processing level 504 each time a step is recognized.
  • each processing level may therefore increase the information density of the data bits used at each level.
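The step recognition described above can be sketched as a simple threshold-crossing detector: many raw accelerometer samples enter, and a single step count leaves, which is one way the information per bit increases at each level. The function name and threshold below are illustrative assumptions, not the disclosed algorithm:

```python
def detect_steps(accel_magnitudes, threshold=1.3):
    """Count steps in a stream of accelerometer magnitudes (in g).

    Emits one step per upward crossing of the threshold, so a long run
    of raw samples is condensed into a small number of step events.
    """
    steps = 0
    above = False  # are we currently above the threshold?
    for a in accel_magnitudes:
        if a > threshold and not above:
            steps += 1       # rising edge: one new step
            above = True
        elif a <= threshold:
            above = False    # re-arm for the next crossing
    return steps
```

Here seven raw samples can reduce to a step count of three, and only that count need be forwarded to the next processing level.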
  • the techniques of this disclosure may be applied to perform power management operations with respect to various components of the system, including one or more sensors or sensor sets and/or processors.
  • the implementation of power management may be performed with respect to each component individually and/or independently of other components.
  • a power mode at one processing level may be changed depending on an operation performed at another processing level.
  • sensors and/or processors at first processing level 502 may be operated at a reduced power level, outputting a reduced set of sensor data until triggered by an operation occurring at second processing level 504, such as recognition of a gesture or other context.
  • second processing level 504 may be configured to trigger a reduction in power at first processing level 502, such as after identifying a suitable period of inactivity or cessation of a current context.
  • a lower hierarchical processing level may also implement a power management change at an upper processing level.
  • first processing level 502 may be configured to recognize a gesture using raw sensor data received at that level and correspondingly activate or deactivate components at second processing level 504 or another hierarchical level.
  • any operation occurring at one processing level may be used as a trigger to initiate an action occurring at another processing level.
  • System 500 may be implemented as a single device, or any number of processing levels may be individually implemented by discrete devices that communicate with one another.
  • system 500 may be a self-contained device such as a mobile phone (e.g., cellular phone, a phone running on a local network, or any other telephone handset), personal digital assistant (PDA), tablet, video game player, video game controller, navigation device, mobile internet device (MID), personal navigation device (PND), digital still camera, digital video camera, binoculars, telephoto lens, portable music, video, or media player, remote control, or other handheld device, or a combination of one or more of these devices.
  • system 500 may include a plurality of devices, such as one of those noted above, or a non-portable device such as a desktop computer, electronic tabletop device, server computer, etc., any of which can communicate with each other using any suitable method, e.g., via any type of wire-based communication protocol (e.g., serial transmissions, parallel transmissions, packet-based data communications), wireless connection (e.g., electromagnetic radiation, infrared radiation or other wireless technology), or a combination of one or more wired connections and one or more wireless connections.
  • FIG. 5 schematically represents an embodiment of the disclosure in which processing levels are implemented in separate devices.
  • first processing level 502 may be implemented in wrist band 550
  • second processing level 504 may be implemented in smart phone 552
  • third processing level may be implemented in server 554.
  • wrist band 550 may include external or embedded motion sensors that output raw gyroscope and accelerometer data.
  • First processing level 502 may be configured to recognize a pattern of the raw motion data as corresponding to a step.
  • wrist band 550 may then output to smart phone 552 a condensation of the raw motion sensor data in the form of indicating the user has taken a step.
  • second processing level 504 may utilize information about the steps by aggregating data from first processing level 502 and performing further operation, such as computing distance, velocity or the like. Smart phone 552 may then output the further processed data to server 554, such as for fitness tracking or navigation.
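The second-level aggregation described above might, as a sketch, turn first-level step events into distance and average speed before forwarding to the server. The stride length, window duration, and function name are assumptions made for illustration:

```python
def second_level_metrics(step_events, stride_m=0.75, window_s=60.0):
    """Aggregate first-level step events into derived quantities.

    step_events: booleans emitted by the first processing level
                 (True when a step was recognized) over one window
    stride_m:    assumed stride length in meters
    window_s:    duration of the reporting window in seconds
    """
    steps = sum(1 for e in step_events if e)
    distance = steps * stride_m
    # The derived record is what a second level might send upward,
    # e.g., to a fitness-tracking server.
    return {"steps": steps,
            "distance_m": distance,
            "speed_mps": distance / window_s}
```

For example, 80 steps in a 60-second window with a 0.75 m stride yields 60 m and an average speed of 1 m/s.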
  • FIG. 6 depicts a flowchart showing a process for processing sensor data hierarchically.
  • sensor data may be received at a first processing level.
  • sensor data received at the first processing level may be raw sensor data, such as from an embedded sensor or an external sensor, or may be sensor data processed at a lower hierarchical level.
  • One or more operations may be performed on the received sensor data by the first processing level as indicated by 602.
  • the first processing level then outputs the processed sensor data to a second processing level in 604.
  • the second processing level performs one or more operations on the processed sensor data and in 608, outputs the result to a third processing level.
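The flow of FIG. 6 — receive sensor data, perform an operation, output the result upward — can be sketched as a chain of processing levels. This is an illustrative model only; the class and the example transforms are assumptions, not the disclosed implementation:

```python
class ProcessingLevel:
    """One level of a hierarchical sensor pipeline: apply an operation
    to incoming data and forward the result to the level above."""

    def __init__(self, operation, upper=None):
        self.operation = operation  # callable applied to incoming data
        self.upper = upper          # next ProcessingLevel, or None at top

    def receive(self, data):
        result = self.operation(data)
        # Forward upward if a higher level exists; otherwise return.
        return self.upper.receive(result) if self.upper else result


# Build a three-level chain, top level first.
level3 = ProcessingLevel(lambda x: {"report": x})              # package result
level2 = ProcessingLevel(lambda xs: sum(xs) / len(xs), upper=level3)  # aggregate
level1 = ProcessingLevel(lambda xs: [abs(v) for v in xs], upper=level2)  # condition raw data
```

Feeding raw samples into `level1` conditions them, averages them at `level2`, and packages the result at `level3`, mirroring steps 602 through 608.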

Abstract

The present invention is directed toward a device and system having a sensor hub capable of receiving measurement outputs from a plurality of sensors and processing the measurements for output to other devices, such as by using a single chip arrangement. The sensor hub provides for facilitating efficient communication among the sensors for improved high-level features, such as interpreting gestures or actions according to the context. Two or more hierarchical processing levels may be provided so that sensor data processed at a lower level is output to an upper level for further processing or other operation involving the processed data from the lower level.

Description

SYSTEM AND METHOD FOR HIERARCHICAL SENSOR PROCESSING
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application claims priority to and benefit of U.S. Patent Application Serial No. 14/480,364, filed September 8, 2014, entitled "System and Method for Hierarchical Sensor Processing," (Atty. Docket No. IVS-381), which is a continuation-in-part of U.S. Patent Application Serial No. 14/201,729, filed March 7, 2014 (Atty. Docket No. IVS-200), and claims the benefit of U.S. Provisional Application No. 61/791,331, filed March 15, 2013, all of which are hereby incorporated by reference in their entirety.
FIELD OF THE INVENTION
[0002] This present invention relates to integrated systems, such as those that may be arranged to include microelectromechanical systems (MEMS) that provide for signal processing, and more particularly for systems that provide for the processing of signals from sensors and also provide for the outputting of information from the processed signals to other devices, applications and arrangements.
BACKGROUND
[0003] Receiving measurement outputs from a plurality of sensors and processing the measurements for a user's requirements often involves complexity in understanding user needs, in integrating sensors and communicating with devices from multiple sources, and in configuring complicated protocols between applications. Accordingly, what is needed is a device and system that is able to facilitate efficient communication among the sensors to be used for data acquisition and that is also able to process the received data to meet user needs. Similarly, the capability to interpret complicated sensed actions is also desired.
[0004] Accordingly, the present invention addresses such a need and is directed to overcoming the prior limitations in the field.
SUMMARY
[0005] According to one or more embodiments of the present disclosure, a method for processing sensor data hierarchically may include receiving sensor data input at a first processing level, performing a first operation on the received sensor data input at the first processing level, outputting processed sensor data from the first processing level to a second processing level, performing a second operation on the processed sensor data at the second processing level and outputting a result from the second operation to a third processing level. Receiving sensor data input at the first processing level may include receiving sensor data from at least one embedded sensor that is integrated with a processor, from at least one embedded sensor that is integrated with memory, receiving raw sensor data from at least one external sensor and/or receiving processed sensor data from another hierarchical processing level.
[0006] In one aspect, a plurality of independent processors may be provided at the first processing level, such that each processor may perform an operation on received sensor data.
[0007] In one aspect, sensor data input may be received at the second processing level from at least one embedded sensor that is integrated with a processor.
[0008] In one aspect, at least one of the first operation and the second operation may include aggregating sensor data. Further, at least one of the first operation and the second operation may comprise processing sensor data to increase an amount of information per bit, and the amount of information per bit may be increased at both the first and the second processing levels.
[0009] In one aspect, sensor data input at the second processing level may be received from an external sensor.
[0010] In one aspect, a plurality of independent processors may be provided at the second processing level, such that each processor may perform an operation on processed sensor data output by another hierarchical processing level.
[0011] In one aspect, power management may be independently implemented at the first and second processing levels. Further, a power mode of one processing level may be changed based, at least in part, on a result of an operation at another processing level. Alternatively or in addition, one processing level may be transitioned between a power save mode and an active mode based, at least in part, on an operation performed at another processing level. Still further, an action at one processing level may be triggered based, at least in part, on an operation performed at another processing level.
[0012] In one aspect, the sensor data input received at one processing level comprises data from a set of sensors.
[0013] This disclosure may also include a system for processing sensor data, having at least one processor of a first processing level configured to receive sensor data input and perform a first operation on the received sensor data, at least one processor of a second processing level configured to receive processed sensor data from the first processing level and perform a second operation on the processed sensor data and at least one processor of a third processing level configured to receive a result of the second operation from the second processing level.
[0014] In one aspect, the system may include at least one embedded sensor that is integrated with a processor of the first processing level providing the sensor data input. The system may also include at least one embedded sensor that is integrated with memory providing the sensor data input. Further, the system may include at least one external sensor providing the sensor data input. Still further, the system may include another hierarchical level providing the sensor data input.
[0015] In one aspect, the system may have a plurality of independent processors of the first processing level, such that each processor is configured to perform an operation on received sensor data.
[0016] In one aspect, the system may have at least one embedded sensor that is integrated with a processor of the second processing level and is configured to output sensor data to the second processing level.
[0017] In one aspect, at least one of the first operation and the second operation comprises aggregating sensor data. Further, at least one of the first operation and the second operation may include processing sensor data to increase an amount of information per bit, and both the first and the second processing levels may be configured to increase an amount of information per bit.
[0018] Further, the system may include at least one external sensor that is configured to output sensor data to the second processing level. In addition, the system may have a plurality of independent processors at the second processing level, such that each processor may perform an operation on processed sensor data output by another hierarchical processing level.
[0019] In one aspect, the system may include a power management block configured to independently control the first and second processing levels. The power management block may change a power mode of one processing level based, at least in part, on a result of an operation at another processing level. The power management block may also transition one processing level between a power save mode and an active mode based, at least in part, on an operation performed at another processing level.
[0020] In one aspect, one processing level may perform an action based, at least in part, on an operation performed at another processing level.
[0021] Additional embodiments of the present disclosure provide for a device and system having a plurality of sensors and a sensor hub coupled to the plurality of sensors, for receiving outputs from the plurality of sensors to be implemented in computer programmable software and stored in computer readable media.
[0022] The above and/or other aspects, features and/or advantages of various embodiments will be further appreciated in view of the following description in conjunction with the accompanying figures. Various embodiments can include and/or exclude different aspects, features and/or advantages where applicable. In addition, various embodiments can combine one or more aspect or feature of other embodiments where applicable. The descriptions of aspects, features and/or advantages of particular embodiments should not be construed as limiting other embodiments or the claims.
BRIEF DESCRIPTION OF THE DRAWINGS
[0023] FIG. 1 is an exemplary system diagram for the device and system herein having one or more embedded sensors and a sensor hub on a single chip, in accordance with one or more embodiments of the present invention.
[0024] FIG. 2A is an exemplary integrated sensor system (ISS) of the present invention having one or more embedded sensors in one or more MEMS chips and one or more CMOS chips with electronic circuits, in a single chip, in accordance with one or more embodiments of the present invention.
[0025] FIG. 2B is an exemplary integrated sensor system (ISS) of the present invention having one or more MEMS chips and one or more CMOS chips vertically stacked and bonded on a substrate, in accordance with one or more embodiments of the present invention.
[0026] FIG. 2C is an exemplary integrated sensor system (ISS) of the present invention having one or more MEMS chips and one or more CMOS chips vertically stacked and bonded on a substrate, in accordance with one or more embodiments of the present invention.
[0027] FIG. 3 depicts a system diagram of the ISS in which the sensor hub comprises one or more analog to digital convertors, one or more processors, memory, a power management block and a controller block, in accordance with one or more embodiments of the present invention.
[0028] FIG. 4 schematically depicts a system architecture of sensor hubs for the hierarchical processing of sensor data, in accordance with one or more embodiments of the present invention.
[0029] FIG. 5 schematically depicts a system having processing levels implemented in separate devices, in accordance with one or more embodiments of the present invention.
[0030] FIG. 6 schematically depicts a flow chart showing a routine for the hierarchical processing of sensor data, in accordance with one or more embodiments of the present invention.
DETAILED DESCRIPTION
[0031] The present invention relates to integrated systems arranged to include microelectromechanical systems (MEMS) that provide for signal processing and more particularly for those systems that provide for the processing of signals from sensors and also provide for the outputting of information from the processed signals to other devices, applications and arrangements. Further, the application relates to integrated sensor systems (ISSs) comprising one or more embedded sensors and a sensor hub arranged on a single chip, which can also receive inputs from external sensor sources and provide for facilitating efficient communication among the sensors for improved high-level features, such as interpreting gestures or actions according to the context. The application also relates to systems and methods for the hierarchical processing of sensor data, including sensor data from an embedded sensor in an ISS or from other sensor configurations.
[0032] The following description is presented to enable one of ordinary skill in the art to make and use the invention and is provided in the context of a patent application and its requirements. Various modifications to the embodiments and the generic principles and features described herein will be readily apparent to those skilled in the art. Thus, the present invention is not intended to be limited to the embodiments shown, but is to be accorded the widest scope consistent with the principles and features described herein.
[0033] In the described embodiments, Micro-Electro-Mechanical Systems (MEMS) refers to a class of devices fabricated using semiconductor-like processes and exhibiting mechanical characteristics such as the ability to move or deform. MEMS often, but not always, interact with electrical signals. Silicon wafers containing MEMS structures are referred to as MEMS wafers. A MEMS device may refer to a semiconductor device implemented as a micro-electro-mechanical system. A MEMS device includes mechanical elements and optionally includes electronics for sensing. MEMS devices include, but are not limited to, gyroscopes, accelerometers, magnetometers, and pressure sensors. MEMS features refer to elements formed by a MEMS fabrication process, such as a bump stop, damping hole, via, port, plate, proof mass, standoff, spring, or seal ring. A MEMS structure may refer to any feature that may be part of a larger MEMS device. One or more MEMS features comprising moveable elements constitute a MEMS structure. An Integrated Circuit (IC) substrate may refer to a silicon substrate with electrical circuits, typically CMOS circuits. A chip includes at least one substrate typically formed from a semiconductor material. A single chip may be formed from multiple substrates, where the substrates are mechanically bonded to preserve the functionality. Multiple chips include at least two substrates, wherein the two substrates are electrically connected but do not require mechanical bonding. A package provides electrical connection from the bond pads on the chip to a metal lead that can be soldered to a PCB. A package typically comprises a substrate and a cover.
[0034] In the described embodiments, “raw data” or “sensor data” refers to measurement outputs from the sensors which are not yet processed. “Motion data” refers to processed sensor data. Processing may include applying a sensor fusion algorithm or applying any other algorithm, such as determining context, gestures, orientation, or a confidence value. In the case of the sensor fusion algorithm, data from one or more sensors are combined to provide an orientation of the device. Processor data, for example, may include motion data plus audio data, vision data (video, still frame), touch/temperature data, and smell/taste data.
[0035] As used herein, integrated sensor systems (ISSs) comprise microelectromechanical systems (MEMS) and sensor subsystems for a user’s application which combine multiple sensing types and capabilities (position, force, pressure, discrete switching, acceleration, angular rate, level, etc.), where that application may be a biological, chemical, electronic, medical, scientific and/or other sensing application. ISSs as used herein are also intended to provide improved sizing and physical structures which are oriented to become smaller with improved technological gains. Similarly, as used herein, ISSs also have suitable biocompatibility, corrosion resistance, and electronic integration for applications in which they may be deployed.
[0036] In an embodiment of the invention, the first substrate is attached to the second substrate through wafer bonding, as described in commonly owned U.S. Patent No. 7,104,129 (incorporated herein by reference) that simultaneously provides electrical connections and hermetically seals the MEMS devices. This fabrication technique advantageously enables technology that allows for the design and manufacture of high performance, multi-axis, inertial sensors in a very small and economical package.
Integration at the wafer-level minimizes parasitic capacitances, allowing for improved signal-to-noise relative to a discrete solution. Such integration at the wafer-level also enables the incorporation of a rich feature set which minimizes the need for external amplification.
[0037] FIG. 1 is an exemplary system diagram 100 for the device and system 110 herein having one or more embedded sensors 120 and a sensor hub 130 on a single chip 100, in accordance with one or more embodiments of the present invention. In an embodiment, the ISS 110 is capable of communicating with external sensors 140 and also capable of outputting information, such as processed sensor data, to another device 150.
[0038] Operationally, the sensor hub 130 receives sensor data from sensors 120. The sensors 120 may include sensing devices and electronic circuits for converting analog signals to digital signals, and may be capable of determining sensed activities and information. These activities, for example, could include but are not limited to sleeping, waking up, walking, running, biking, participating in a sport, walking on stairs, driving, flying, training, exercising, cooking, watching television, reading, working at a computer, and eating.
[0039] Furthermore, the sensors could be utilized for determining sensed locations. For example, these locations include but are not limited to a home, a workplace, a moving vehicle, indoor, outdoor, a meeting room, a store, a mall, a kitchen, a living room, and a bedroom.
[0040] In such an embodiment, signals from a global positioning system (GPS) or other wireless system that generates location data could be utilized. In addition, the sensors could send data to a GPS or other wireless system that generates location data to aid in low power location and navigation.
[0041] Sensors may include those devices which are capable of gathering data and/or information involving measurements from an accelerometer, gyroscope, compass, pressure sensor, or microphone, as well as humidity, temperature, gas, chemical, ambient light, proximity, touch, and tactile information, for example; however, the present invention is not so limited. Sensors of the present invention are embedded sensors for those sensors on the chip and/or external to the ISS for sensed data external to the chip, in one or more embodiments. From FIG. 1, the ISS 110 processes signals from sensors 120, 140 and outputs 150 to any other output device or to another device for further processing. For example, in an embodiment, the output device is one or more of an application processor, memory, an audio output device, a haptic sensor and an LED.
[0042] In another embodiment, the sensors are a MEMS sensor or a solid state sensor, though the sensors of the device and system may be any type of sensor. For instance, it is envisioned that the present invention may use data sensed from sensors including but not limited to a 3-axis accelerometer, 3-axis gyroscope, 3-axis magnetometer, pressure sensor, microphone, chemical sensor, gas sensor, humidity sensor, image sensor, ambient light, proximity, touch, and audio sensors, etc.
[0043] In a further embodiment, a gyroscope of the present invention includes the gyroscope disclosed and described in commonly-owned U.S. Patent No. 6,892,575, entitled "X-Y Axis Dual-Mass Tuning Fork Gyroscope with Vertically Integrated Electronics and Wafer-Scale Hermetic Packaging", which is incorporated herein by reference. In another embodiment, the gyroscope of the present invention is a gyroscope disclosed and described in the commonly-owned U.S. Patent Application No. 13/235,296, entitled “Micromachined Gyroscope Including a Guided Mass System”, also incorporated herein by reference. In yet a further embodiment, the pressure sensor of the present invention is a pressure sensor as disclosed and described in the commonly-owned U.S. Patent Application No. 13/536,798, entitled “Hermetically Sealed MEMS Device with a Portion Exposed to the Environment with Vertically Integrated Electronics,” incorporated herein by reference.
[0044] In a further embodiment of the present invention, the sensors are formed on a MEMS substrate, the electronic circuits are formed on a CMOS substrate, and the CMOS and MEMS substrates are vertically stacked and attached, as disclosed and described in commonly-owned U.S. Patent No. 8,250,921, entitled “Integrated Motion Processing Unit (MPU) With MEMS Inertial Sensing And Embedded Digital Electronics”.
[0045] FIG. 2A is an exemplary integrated sensor system (ISS) of the present invention having one or more embedded sensors in one or more MEMS chips 214 and one or more CMOS chips 212 with electronic circuits, attached to a substrate 206 to form a single chip 200, in accordance with one or more embodiments of the present invention. In the described embodiments, the electronic circuits may include circuitry for sensing signals from sensors, processing the sensed signals and converting them to digital signals. In an embodiment, FIG. 2A also provides for an ISS of the present invention having a first arrangement of a MEMS chip 214 arranged vertically with a CMOS chip 212, and a second arrangement of a chip 202 vertically stacked with a chip 204, where the first and second arrangements are side-by-side on a substrate 206. Chip 202 and chip 204 can be any combination of CMOS and MEMS. In another embodiment, chip 202 may not be present. Yet, in another embodiment, multiple chips such as 202 or 204 may be stacked. In some embodiments, the CMOS chip may also include memory.
[0046] FIG. 2B is an exemplary integrated sensor system (ISS) 300 of the present invention having one or more MEMS chips 302 and one or more CMOS chips 304 vertically stacked and bonded 303 on a substrate 306, in accordance with one or more embodiments of the present invention. In an arrangement, the combined MEMS and CMOS chips are bonded or connected by solder balls to block 305 and then bonded to the substrate 306. In an embodiment, block 305 could be any of, or any combination of, electronics, sensors or solid state devices such as batteries.
[0047] FIG. 2C is an exemplary integrated sensor system (ISS) 350 of the present invention having one MEMS chip 302 and a plurality of CMOS chips 304A-304C vertically stacked, where CMOS chip 304A is wire bonded to CMOS chip 304B, which is wire bonded to CMOS chip 304C. The CMOS chip 304C in turn is wire bonded to a substrate 306, in accordance with one or more embodiments of the present invention. In an embodiment, the CMOS chips 304A, 304B and 304C could contain any of, or any combination of, electronic circuits.
[0048] In one embodiment, this present invention relates to integrated systems arranged to include microelectromechanical systems (MEMS) that provide for signal processing and more particularly for those systems that provide for the processing of signals from sensors and also provide for the outputting of information from the processed signals to other devices, applications and arrangements. Further, the application relates to integrated sensor systems (ISSs) comprising one or more embedded sensors and a sensor hub arranged on a single chip, which can also receive inputs from external sensor sources and provide for facilitating efficient communication among the sensors for improved high-level features, such as interpreting gestures or actions according to the context. The present invention provides for an ISS implemented in a single chip that can be mounted onto a surface of a printed circuit board (PCB). In another embodiment, the ISS of the present invention comprises one or more MEMS chip having one or more sensors attached to one or more CMOS chips with electronic circuitry. In a further embodiment, one or more MEMS chips and one or more CMOS chips are vertically stacked and bonded. In yet another embodiment, an ISS of the present invention provides for having more than one MEMS and more than one CMOS chips arranged and placed side-by-side.
[0049] FIG. 3 depicts a system diagram 400 of the ISS 405 in which the sensor hub 450 comprises one or more analog to digital convertors 451, one or more processors (455-457), memory 452, a power management block 453 and a controller block 454, in accordance with one or more embodiments of the present invention. In an embodiment, the sensor hub 450 comprises one or more analog to digital convertors, one or more processors, memory, one or more power management blocks and one or more controller blocks. For example, the one or more processors 455-457 include but are not limited to any one of, or any combination of, an audio processor, an image processor, a motion processor, a touch processor, a location processor, a wireless processor, a radio processor, a graphics processor, a power management processor, an environmental processor, an application processor (AP), and a microcontroller unit (MCU). Any of the one or more processors 455-457 or external sensors 470 can provide one or more interrupts, based upon the sensor inputs, to an external device, any of the embedded or external sensors, or any processor or the like. The interrupt signal can perform any one of, or any combination of: waking up a processor and/or sensor from a sleep state, initiating a transaction between memory and a sensor, initiating a transaction between memory and a processor, and initiating a transfer of data between memory and an external device. In addition, the sensor hub may include, in some embodiments, a real-time clock (RTC), a system clock oscillator or any other type of clock circuitry. In an embodiment, resonators for the clocks can be implemented with MEMS structures. In so doing, external crystal resonators are not required, thereby saving cost, reducing power requirements and reducing the overall size of the device.
[0050] Embodiments of the sensor hub described herein can take the form of an entirely hardware implementation, an entirely software implementation, or an implementation containing both hardware and software elements. Embodiments may be implemented in software, which includes, but is not limited to, application software, firmware, resident software, microcode, etc. [0051] The steps described herein may be implemented using any suitable controller or processor, and a software application, which may be stored on any suitable storage location or computer-readable medium. The software application provides instructions that enable the processor to perform the functions described herein.
[0052] Furthermore, embodiments may take the form of a computer program product accessible from a computer-usable or computer-readable medium providing program code for use by or in connection with a computer or any instruction execution system. For the purposes of this description, a computer-usable or computer-readable medium can be any apparatus that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
[0053] The medium may be an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system (or apparatus or device), or a propagation medium. Examples of a computer-readable medium include a semiconductor or solid state memory, magnetic tape, a removable computer diskette, a random access memory (RAM), a read-only memory (ROM), a rigid magnetic disk, and an optical disk. Current examples of optical disks include DVD, compact disk-read-only memory (CD-ROM), and compact disk-read/write (CD-R/W).
[0054] From FIG. 3, the ISS 405 receives inputs from one or more sensor sets (410, 420, 430). A sensor set as used herein may include a single sensor or an arrangement of a plurality of sensors, none of which are required to be of the same or similar type or utility, nor are any required to differ in type or utility. A sensor set, or grouping, may include or be determined in relation to one or more of the type of sensors, the type of application intended, the type of application the sensor is to be connected to or in communication with, etc. It will be appreciated by those skilled in the art that the present invention is not constrained or limited to a particular arrangement of sensors in a specific manner to constitute a grouping herein.
[0055] For example, in an embodiment, using FIG. 3 as an exemplar, sensor set 1 (410) includes a 3-axis accelerometer, 3-axis gyroscope, and a 3-axis magnetometer. Sensor set 2 (420) includes certain sensors exposed to the environment such as a pressure sensor, a microphone, a chemical sensor, a gas sensor, a humidity sensor, etc. Sensor set 3 (430) includes certain sensors being one or more of ambient light, proximity, touch, and audio-based sensors. In a further embodiment, each of the sets of sensors is connected to a dedicated processor, where the connected processor is a general purpose processor.
[0056] In yet a further embodiment, each of the sets of sensors is connected to a dedicated processor, where the connected processor is a specialized processor, such as that required, by example, for an audio processor to process audio input. In still another embodiment, each of the sets of sensors is arranged in relation to the processor to which it connects.
[0057] It will also be appreciated that each of the processors of the present invention can execute various sensor fusion algorithms, that is, algorithms that combine various sensor inputs to generate one or more of the orientation of the device or combined sensor data, which may then be used for further processing or any other appropriate action, such as determining the orientation of the user.
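By way of illustration only, and not as a limitation of the sensor fusion algorithms contemplated herein, one well-known example of combining sensor inputs into a device orientation is a complementary filter that blends gyroscope and accelerometer data. The following Python sketch is an assumption for illustration; the function name, parameter names and filter coefficient are not taken from this disclosure.

```python
import math

def complementary_filter(pitch_deg, gyro_rate_dps, accel_g, dt, alpha=0.98):
    """Fuse gyroscope and accelerometer readings into a pitch estimate.

    pitch_deg: previous pitch estimate, in degrees
    gyro_rate_dps: angular rate about the pitch axis, in degrees/second
    accel_g: (ax, ay, az) accelerometer reading, in units of g
    dt: time step, in seconds
    alpha: weight given to the integrated gyroscope estimate
    """
    ax, ay, az = accel_g
    # Pitch implied by the gravity direction (valid when the device
    # is not otherwise accelerating).
    accel_pitch = math.degrees(math.atan2(ax, math.sqrt(ay * ay + az * az)))
    # Blend: gyroscope for short-term accuracy (low noise, but drifts),
    # accelerometer for long-term stability (noisy, but drift-free).
    return alpha * (pitch_deg + gyro_rate_dps * dt) + (1.0 - alpha) * accel_pitch

# Example: device held level with the gyroscope reporting no rotation;
# the estimate remains at zero pitch.
pitch = 0.0
for _ in range(100):
    pitch = complementary_filter(pitch, 0.0, (0.0, 0.0, 1.0), 0.01)
```

In such a scheme, the accelerometer term continuously corrects the slow drift that accumulates when gyroscope rates are integrated over time.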
[0058] Returning to FIG. 3, the sensor hub 450 provides for facilitating efficient communication among the sensors for improved high-level features. For example, in one or more embodiments, the sensor hub is capable of recognizing gestures and, in response, turning sensors off or on or triggering processors. Similarly, the sensor hub is capable of performing intelligent sensor fusion in one or more aspects. For example, the present invention is capable of combining data from ambient light, proximity and motion sensors to trigger the sending of data from a microphone to an audio processor (AP). Additionally, in one or more embodiments, the sensor hub is capable of processing sensor inputs and outputting signals that actuate haptic sensors (i.e., tactile feedback technology which takes advantage of the sense of touch by applying forces, vibrations, or motions to the user). Using the present invention, the output signals can be one or more of audio, vibration or light (LED).
[0059] From FIG. 3, the power management block 453 performs power management operations across all sensor sets, including external sensors 470. Using the present invention, the power management block is capable of turning off or turning on a sensor based on other sensor inputs or input from the application processor (AP). The power management block is further capable of putting the device or processors in a low power mode based on the one or more sensors. The power management block is further capable of applying a low power mode to one or more sensors based on one or more other sensors. [0060] For example, when a gesture is recognized by a processor, the power management block is capable of turning on the microphone. In another example, when the ambient light sensor senses low light in an environment and the sensed low light situation is then combined with accelerometer measurements, the device may be set or otherwise configured for sleep mode.
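The power management behavior described above may be illustrated, purely as a non-limiting sketch, as a rule that maps sensor inputs to power actions. In the following Python fragment, the function name, action strings and threshold values are hypothetical choices for illustration only and do not appear in this disclosure.

```python
def power_actions(gesture_recognized, ambient_lux, accel_motion,
                  lux_dark=10.0, motion_still=0.05):
    """Derive power-management actions from sensor inputs.

    gesture_recognized: True when a processor has recognized a gesture
    ambient_lux: ambient light level reported by the light sensor
    accel_motion: a measure of motion derived from the accelerometer
    lux_dark, motion_still: illustrative thresholds (assumed values)
    """
    actions = []
    # A recognized gesture turns on the microphone.
    if gesture_recognized:
        actions.append("microphone_on")
    # Low ambient light combined with accelerometer stillness suggests
    # the device is idle, so it may be configured for sleep mode.
    if ambient_lux < lux_dark and accel_motion < motion_still:
        actions.append("enter_sleep_mode")
    return actions
```

For example, `power_actions(True, 500.0, 1.0)` yields only the microphone action, while low light with no motion yields the sleep action.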
[0061] From FIG. 3, the memories 452a-452d can store raw sensor data from sensors, including those of the external sensors. The motion data or processed data is also stored in the memory. It will be appreciated that the present invention is not limited to particular memory configurations or types, such that memories 452a-452d as used herein can include single port or multiport SRAM, DRAM or FIFO, for instance. In other embodiments, a first memory can reside in ISS 405 outside sensor hub 450, in addition to memories 452a-452d, to store any of sensor data, motion data and instructions. In yet other embodiments, a second memory can reside external to ISS 405 to store sensor data, motion data and instructions.
[0062] The controller block 454 of FIG. 3 includes control logic for the sensor hub 450. The controller block also includes a bus master. The bus master, not pictured, manages the data storage from the sensors and also provides for the storing of data from the processors.
[0063] In a further embodiment, the sensor hub of the present invention is capable of receiving measurements from more than one sensor to determine the "context" of the user’s actions. In the embodiment, the sensor hub is capable of then interpreting gestures or actions according to the context. [0064] Context can be established in a variety of ways. In one example, location can establish the context. The context could also be established based on the way the system and the devices to be controlled are connected (GPS, local Wi-Fi, etc.).
[0065] A state of the device to be controlled establishes the context. For example, if a device that includes the ISS has a browser page open, this could mean that a context to enable "air-mouse" type functionality on a wearable device is established. This state could be as simple as the device being turned on.
[0066] In other aspects, a system and method in accordance with the present invention can be implemented in varying contexts and situations. For instance, in a preferred embodiment, a location defines the context for the operation of the ISS of the present invention. In such a situation, the implementation could be based on inertial sensors providing location information or on the way in which the system is connected (such as with localized WI-FI or via another connection method) where all the devices to be controlled are connected similarly, irrespective of the WI-FI source, etc.
[0067] Still, in other aspects, an implementation could be based on the state of the device to be controlled as defining the context. For example, in an implementation involving a television having a browser page open, a context to enable "air-mouse" type functionality on the wearable device could be established. In such an implementation, the state could simply be the device being turned ON or OFF.
[0068] Still, in other aspects, an implementation could be based on time as defining the context. For example, an implementation could involve a determination as to whether it is day or night to enable a light on/off functionality. [0069] Further, in other aspects, an implementation could be based on proximity as defining the context. For example, in an implementation, information from an ISS about proximity to a device could be used as the context.
[0070] Additionally, in other aspects, an implementation could be based on a picture of the device to be controlled as defining the context. For example, in such an implementation, a picture of the device could be used as a context, such as in the situation where the wearable device takes the form of computer-based glasses, for instance.
[0071] Still, in other aspects, an implementation could be based on a device being turned ON or OFF as defining the context. For example, in an implementation involving a device turning ON (one sensor), such could further be associated with a proximity to the device (another sensor).
[0072] Still, in other aspects, an implementation could be based on a device being activated by another independent act as defining the context. For example, in an implementation involving a phone ringing, where the ringing is triggered by another party calling in to a line, such could further be associated with lowering volumes or turning off those associated remote devices that are active at the time of the phone ringing.
[0073] Further, in other aspects, an implementation could be based on being able to access a device’s actuation as defining the context. For example, in an implementation involving a garage door, even in the event where a car within the garage is being stolen, the thief is unable to open the garage door without control over a device that includes an ISS which enables the door to open or close. [0074] Further, in other aspects, an implementation could be based on a user’s situation as defining the context. For example, in an implementation involving a user sleeping, under such a context, the sensors of the ISS could establish turn-off/turn-on features on one or more remote devices (e.g., auto-alarm the house, control the thermostat, CO alarm, smoke detector, etc.).
[0075] Still further, in other aspects, an implementation could be based on a context of a social gathering at a predetermined location. For example, in an implementation involving a social event having a series of predetermined timed events, where each event has multiple remote devices engaged to be activated to perform a function (e.g., streamers release, music, lights, microphone, etc.), each remote device is configured to be active only during pre-set periods, and each device is also configured to recognize and receive specific commands from gestures or movements from a device that includes the ISS. In such a situation, a user can control certain of the remote devices independently of one another, and others dependently with one another, without manually resetting or engaging others at additional cost to operate the event.
[0076] By utilizing different types of sensors, more information can be provided to obtain the proper context. Hence, depending upon the situation, there may be different levels of importance for different types of situations. For example, for a meeting to which a person has remote access, the primary sensors may be motion sensors that let the user know a person has entered the room; that information may engage a video camera and a microphone at the remote location that allow the user to see and communicate with whoever has entered. In another example, additional sensors may be used to provide information about which room is being utilized for the meeting, as well as the identity of all the attendees, to provide more context. The above description is by way of example only, and one of ordinary skill in the art recognizes that any sensor, or any combination of sensors, can provide context information; generally, the more different types of sensors that are available, the better the context for a user. The sensors in the ISS, along with the algorithm in the memory, can detect basic units such as velocity, acceleration, gravity, elevation, environmental motion/vibrations, background noise, audio signature, detected keywords, images, video, motion gestures, image gestures, ambient light, body temperature, ambient temperature, humidity, rotation, orientation, heading, ambient pressure, air quality, and flat tire detection. The air quality can be the amount of oxygen (O2), carbon dioxide (CO2) or a particle count.
[0077] In one embodiment, this present invention relates to system architectures configured to process sensor data hierarchically. Two or more processing levels may be provided so that sensor data processed at a lower level may be output to an upper level for further processing or other operation involving the processed data from the lower level. At least one processor is provided at each processing level and, as desired, may be implemented as an ISS comprising one or more embedded sensors, as a processor receiving inputs from external sensor sources or as any other processor and sensor configuration. As will be described below, such an architecture may facilitate efficient communication among the sensors for improved high-level features, such as interpreting gestures or actions according to the context.
[0078] To help illustrate aspects of this disclosure, FIG. 4 schematically depicts an exemplary diagram of a system 500 configured to process sensor data hierarchically. As shown, system 500 may involve at least two hierarchical processing levels, such as first processing level 502 and second processing level 504. In this embodiment, first processing level 502 features sensor hub 506, which has processor 508, memory 510 and at least one embedded sensor 512. As described above, it may be desirable to organize sensors that measure related conditions by grouping them into a sensor hub. First processing level 502 also includes sensor hub 514 having processor 516 and memory 518 receiving input from an external sensor 520. Accordingly, processors at processing level 502 may receive sensor data input from any suitable source, such as an embedded internal sensor 512 or an external sensor 520. Each processor 508 and 516 may independently perform one or more operations on the received sensor data. As will be appreciated, a variety of operations may be performed on the sensor data, including aggregation, sensor fusion, gesture recognition and other suitable algorithms for processing sensor data. Although first processing level 502 is shown as receiving raw sensor data, such as from internal sensor 512 or external sensor 520, one or more processors at first processing level 502 may receive processed sensor data from a lower hierarchical level.
[0079] Sensor hubs 506 and 514 may be configured to output processed sensor data to second processing level 504 after performing one or more operations with processors 508 and 516, respectively. In this embodiment, second processing level 504 includes sensor hub 522 having processor 524 and memory 526 to receive the processed sensor data output from first processing level 502. Processor 524 may perform one or more operations on the output processed sensor data, such as those described above. Raw sensor data may also be received for processing at second processing level 504. As shown, sensor hub 522 includes embedded sensor 528, which may output data to processor 524. Raw sensor data may also be provided to second processing level 504 from sensor hub 530 having memory 532 to aggregate data from embedded sensor 534 or other externally implemented sensor. In addition to receiving processed sensor data from first processing level 502, second processing level 504 may also receive processed sensor data from a different hierarchical level.
[0080] Second processing level 504 may be configured to output processed sensor data from processor 524 to third processing level 536, which in this embodiment includes application processor 538. In some embodiments, third processing level 536 may represent the top of the hierarchy, but additional processing levels may be provided as desired. Processed sensor data output by second processing level 504 at least includes the results of processor 524 performing one or more operations on data received from first processing level 502, but may also include the results of processor 524 performing one or more operations on raw sensor data, such as received from sensor 528 or sensor hub 530, or on processed sensor data received from a different hierarchical level.
[0081] In one aspect, the one or more operations performed at each processing level may be considered to increase the amount of information represented by each data bit or otherwise condense the data received from a lower hierarchical processing level. As a representative example and without limitation, first processing level 502 may receive raw motion sensor data, such as from embedded sensor 512 that may include a gyroscope and/or an accelerometer. Processor 508 may be configured to recognize a pattern of raw motion sensor data corresponding to a specific context as described above, such as one step of a user’s stride in a pedometer application. Consequently, processor 508 may output information to second processing level 504 each time a step is recognized. One of skill in the art will appreciate that relatively few bits may be used to indicate a step as compared to the number of bits corresponding to the motion sensor data used to recognize the step. Likewise, the processed sensor data (e.g., each step) received by second processing level 504 may be further processed, such as by using the number of steps to determine velocity or distance in a fitness tracking application, or combined with other sources of sensor data, such as heading information in a dead reckoning navigational application. In some embodiments, each processing level may therefore increase the information density of the data bits used at each level.
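The pedometer example above may be illustrated, purely as a non-limiting sketch in Python: a first processing level condenses many raw accelerometer samples into a small number of step events, and a second processing level turns the compact step count into a distance. The step-detection method (simple threshold crossing), the threshold and the stride length are hypothetical values assumed for illustration.

```python
def detect_steps(accel_magnitudes, threshold=1.2):
    """First processing level: condense raw accelerometer magnitudes
    (in units of g) into discrete step events by counting rising
    crossings of a threshold."""
    steps = 0
    above = False
    for a in accel_magnitudes:
        if a > threshold and not above:
            steps += 1       # rising edge of the signal = one step event
            above = True
        elif a <= threshold:
            above = False
    return steps

def distance_from_steps(steps, stride_m=0.7):
    """Second processing level: further process the compact step count,
    e.g., into a distance estimate for a fitness tracking application."""
    return steps * stride_m

raw = [1.0, 1.3, 1.0, 0.9, 1.4, 1.0, 1.3, 0.8]  # eight raw samples
steps = detect_steps(raw)                       # three step events
distance = distance_from_steps(steps)           # about 2.1 meters
```

Here eight raw samples are condensed to a single small integer at the first level, illustrating how each level increases the information represented by each data bit passed upward.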
[0082] As described above, the techniques of this disclosure may be applied to perform power management operations with respect to various components of the system, including one or more sensors or sensor sets and/or processors. As desired, the implementation of power management may be performed with respect to each component individually and/or independently of other components. Notably, it may be desirable to operate one or more components at one processing level in a power save or low power mode and to operate one or more components at another processing level in an active or full power mode.
[0083] In one aspect, a power mode at one processing level may be changed depending on an operation performed at another processing level. For example, sensors and/or processors at first processing level 502 may be operated at a reduced power level, outputting a reduced set of sensor data until triggered by an operation occurring at second processing level 504, such as recognition of a gesture or other context. Upon occurrence of such a trigger, sensors and/or processors at first processing level 502 may be fully activated to output a full set of sensor data. Similarly, second processing level 504 may be configured to trigger a reduction in power at first processing level 502, such as after identifying a suitable period of inactivity or cessation of a current context. Further, a lower hierarchical processing level may also implement a power management change at an upper processing level. For example, first processing level 502 may be configured to recognize a gesture using raw sensor data received at that level and correspondingly activate or deactivate components at second processing level 504 or another hierarchical level. In general, any operation occurring at one processing level may be used as a trigger to initiate an action occurring at another processing level.
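As an illustrative sketch only (the class and method names are assumptions, not the patent's implementation), the cross-level power triggering described above could be modeled as one level driving another level's power mode:

```python
class ProcessingLevel:
    """A processing level with a simple two-state power mode."""
    def __init__(self, name):
        self.name = name
        self.mode = "low_power"   # start in a power save mode

    def set_mode(self, mode):
        self.mode = mode

class SecondLevel:
    """Holds a reference to a lower level and triggers its power changes."""
    def __init__(self, first_level):
        self.first = first_level

    def on_gesture_recognized(self):
        # Recognition at this level fully activates the lower level.
        self.first.set_mode("active")

    def on_inactivity_detected(self):
        # A period of inactivity triggers a return to power save mode.
        self.first.set_mode("low_power")

level1 = ProcessingLevel("first")
level2 = SecondLevel(level1)
level2.on_gesture_recognized()
print(level1.mode)   # active
level2.on_inactivity_detected()
print(level1.mode)   # low_power
```

The same pattern works in the other direction, with a lower level holding a reference to an upper level, matching the paragraph's observation that any operation at one level may trigger an action at another.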
[0084] System 500 may be implemented as a single device as desired, or any number of processing levels may be individually implemented by discrete devices that communicate with one another. Thus, system 500 may be a self-contained device such as a mobile phone (e.g., cellular phone, a phone running on a local network, or any other telephone handset), personal digital assistant (PDA), tablet, video game player, video game controller, navigation device, mobile internet device (MID), personal navigation device (PND), digital still camera, digital video camera, binoculars, telephoto lens, portable music, video, or media player, remote control, or other handheld device, or a combination of one or more of these devices. In other embodiments, system 500 may include a plurality of devices, such as one of those noted above, or a non-portable device such as a desktop computer, electronic tabletop device, server computer, etc., any of which can communicate with each other using any suitable method, e.g., via any type of wire-based communication protocol (e.g., serial transmissions, parallel transmissions, packet-based data communications), wireless connection (e.g., electromagnetic radiation, infrared radiation or other wireless technology), or a combination of one or more wired connections and one or more wireless connections.
[0085] Accordingly, FIG. 5 schematically represents an embodiment of the disclosure in which processing levels are implemented in separate devices. For purposes of illustration and not limitation, first processing level 502 may be implemented in wrist band 550, second processing level 504 may be implemented in smart phone 552, and third processing level 536 may be implemented in server 554. In the context of the pedometer example described above, wrist band 550 may include external or embedded motion sensors that output raw gyroscope and accelerometer data. First processing level 502 may be configured to recognize a pattern of the raw motion data as corresponding to a step. In turn, wrist band 550 may then output to smart phone 552 a condensation of the raw motion sensor data in the form of an indication that the user has taken a step. Correspondingly, second processing level 504 may utilize information about the steps by aggregating data from first processing level 502 and performing further operations, such as computing distance, velocity or the like. Smart phone 552 may then output the further processed data to server 554, such as for fitness tracking or navigation.
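The wrist band / smart phone / server example above can be sketched, purely for illustration, as a three-stage pipeline in which each stage further condenses its input. The stride length and the function names are assumed values, not part of the disclosure:

```python
STRIDE_M = 0.7  # assumed average stride length in meters

def first_level(step_events):
    """Wrist band: condense recognized step events into a step count."""
    return sum(step_events)

def second_level(step_count, seconds):
    """Smart phone: aggregate steps into distance and velocity."""
    distance = step_count * STRIDE_M
    return {"distance_m": distance, "velocity_mps": distance / seconds}

def third_level(summary, log):
    """Server: store the further-processed data, e.g. for fitness tracking."""
    log.append(summary)
    return log

log = []
summary = second_level(first_level([1, 1, 0, 1, 1]), seconds=2.0)
third_level(summary, log)
print(log)
```

Each stage's output is smaller and more meaningful than its input: raw samples become step events, step events become a count, and the count becomes a distance and velocity summary.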
[0086] To help illustrate aspects of this disclosure, FIG. 6 depicts a flowchart showing a process for processing sensor data hierarchically. Beginning with 600, sensor data may be received at a first processing level. As described above, sensor data received at the first processing level may be raw sensor data, such as from an embedded sensor or an external sensor, or may be sensor data processed at a lower hierarchical level. One or more operations may be performed on the received sensor data by the first processing level as indicated by 602. The first processing level then outputs the processed sensor data to a second processing level in 604. Next, in 606, the second processing level performs one or more operations on the processed sensor data and in 608, outputs the result to a third processing level.
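The flow of FIG. 6 (steps 600 through 608) can be summarized, as an illustrative sketch only, with the per-level operations left as pluggable callables, since the disclosure does not fix which operations each level performs:

```python
def process_hierarchically(sensor_data, first_op, second_op):
    """Generic shape of the FIG. 6 flow (names are illustrative)."""
    processed = first_op(sensor_data)   # 600-602: receive and process at level 1
    result = second_op(processed)       # 604-606: output to and process at level 2
    return result                       # 608: output result to third level

# Example with arbitrary condensing operations:
out = process_hierarchically(
    [0.1, 0.9, 0.2, 1.1],
    first_op=lambda xs: [x > 0.5 for x in xs],  # e.g., event detection
    second_op=sum,                               # e.g., aggregation
)
print(out)  # 2
```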
[0087] Although the present invention has been described in accordance with the embodiments shown, one of ordinary skill in the art will readily recognize that there could be variations to the embodiments and those variations would be within the spirit and scope of the present invention. Accordingly, many modifications may be made by one of ordinary skill in the art without departing from the spirit and scope of the present invention.

Claims

What is claimed is:
1. A method for processing sensor data hierarchically, comprising:
receiving sensor data input at a first processing level;
performing a first operation on the received sensor data input at the first processing level;
outputting processed sensor data from the first processing level to a second processing level;
performing a second operation on the processed sensor data at the second processing level; and
outputting a result from the second operation to a third processing level.
2. The method of claim 1, wherein receiving sensor data input at the first processing level comprises receiving sensor data from at least one embedded sensor that is integrated with a processor.
3. The method of claim 1, wherein receiving sensor data input at the first processing level comprises receiving sensor data from at least one embedded sensor that is integrated with memory.
4. The method of claim 1, wherein receiving sensor data input at the first processing level comprises receiving raw sensor data from at least one external sensor.
5. The method of claim 1, wherein receiving sensor data input at the first processing level comprises receiving processed sensor data from another hierarchical processing level.
6. The method of claim 1, further comprising providing a plurality of independent processors at the first processing level, wherein each processor is configured to perform an operation on received sensor data.
7. The method of claim 1, further comprising receiving sensor data input at the second processing level from at least one embedded sensor that is integrated with a processor.
8. The method of claim 1, wherein at least one of the first operation and the second operation comprises aggregating sensor data.
9. The method of claim 1, wherein at least one of the first operation and the second operation comprises processing sensor data to increase an amount of information per bit.
10. The method of claim 9, wherein an amount of information per bit is increased at the first and the second processing levels.
11. The method of claim 1, further comprising receiving sensor data input at the second processing level from an external sensor.
12. The method of claim 1, further comprising providing a plurality of independent processors at the second processing level, wherein each processor is configured to perform an operation on processed sensor data output by another hierarchical processing level.
13. The method of claim 1, further comprising independently implementing power management at the first and second processing levels.
14. The method of claim 13, further comprising changing a power mode of one processing level based, at least in part, on a result of an operation at another processing level.
15. The method of claim 14, further comprising transitioning one processing level between a power save mode and an active mode based, at least in part, on an operation performed at another processing level.
16. The method of claim 1, further comprising triggering an action at one processing level based, at least in part, on an operation performed at another processing level.
17. The method of claim 1, wherein the sensor data input received at one processing level comprises data from a set of sensors.
18. A system for processing sensor data, comprising:
at least one processor of a first processing level configured to receive sensor data input and perform a first operation on the received sensor data;
at least one processor of a second processing level configured to receive processed sensor data from the first processing level and perform a second operation on the processed sensor data; and
at least one processor of a third processing level configured to receive a result of the second operation from the second processing level.
19. The system of claim 18, further comprising at least one embedded sensor that is integrated with a processor of the first processing level providing the sensor data input.
20. The system of claim 18, further comprising at least one embedded sensor that is integrated with memory providing the sensor data input.
21. The system of claim 18, further comprising at least one external sensor providing the sensor data input.
22. The system of claim 18, further comprising another hierarchical level providing the sensor data input.
23. The system of claim 18, further comprising a plurality of independent processors of the first processing level, wherein each processor is configured to perform an operation on received sensor data.
24. The system of claim 18, further comprising at least one embedded sensor that is integrated with a processor of the second processing level and is configured to output sensor data to the second processing level.
25. The system of claim 18, wherein at least one of the first operation and the second operation comprises aggregating sensor data.
26. The system of claim 18, wherein at least one of the first operation and the second operation comprises processing sensor data to increase an amount of information per bit.
27. The system of claim 26, wherein the first and the second processing levels are configured to increase an amount of information per bit.
28. The system of claim 18, further comprising at least one external sensor that is configured to output sensor data to the second processing level.
29. The system of claim 18, further comprising a plurality of independent processors at the second processing level, wherein each processor is configured to perform an operation on processed sensor data output by another hierarchical processing level.
30. The system of claim 18, further comprising a power management block configured to independently control the first and second processing levels.
31. The system of claim 30, wherein the power management block is configured to change a power mode of one processing level based, at least in part, on a result of an operation at another processing level.
32. The system of claim 31, wherein the power management block is configured to transition one processing level between a power save mode and an active mode based, at least in part, on an operation performed at another processing level.
33. The system of claim 18, wherein one processing level is configured to perform an action based, at least in part, on an operation performed at another processing level.
34. The system of claim 18, wherein the sensor data input received at one processing level comprises data from a set of sensors.
PCT/US2015/047641 2014-09-08 2015-08-31 System and method for hierarchical sensor processing WO2016040018A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US14/480,364 US20150321903A1 (en) 2013-03-15 2014-09-08 System and method for hierarchical sensor processing
US14/480,364 2014-09-08

Publications (1)

Publication Number Publication Date
WO2016040018A1 true WO2016040018A1 (en) 2016-03-17

Family

ID=54207671

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2015/047641 WO2016040018A1 (en) 2014-09-08 2015-08-31 System and method for hierarchical sensor processing

Country Status (1)

Country Link
WO (1) WO2016040018A1 (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0444801A1 (en) * 1990-02-27 1991-09-04 Atlantic Richfield Company A hierarchical process control system and method
US5907559A (en) * 1995-11-09 1999-05-25 The United States Of America As Represented By The Secretary Of Agriculture Communications system having a tree structure
US6892575B2 (en) 2003-10-20 2005-05-17 Invensense Inc. X-Y axis dual-mass tuning fork gyroscope with vertically integrated electronics and wafer-scale hermetic packaging
US7104129B2 (en) 2004-02-02 2006-09-12 Invensense Inc. Vertically integrated MEMS structure with electronics in a hermetically sealed cavity
US7219156B1 (en) * 1999-09-29 2007-05-15 Silicon Graphics, Inc. System and method for a hierarchical system management architecture of a highly scalable computing system
US20110307433A1 (en) * 2010-06-10 2011-12-15 Paul Dlugosch Programmable device, heirarchical parallel machines, methods for providing state information
US8250921B2 (en) 2007-07-06 2012-08-28 Invensense, Inc. Integrated motion processing unit (MPU) with MEMS inertial sensing and embedded digital electronics
US20140260704A1 (en) * 2013-03-15 2014-09-18 Invensense, Inc. Device and system for integrated sensor system (iss)


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 15771749; Country of ref document: EP; Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: pct application non-entry in european phase
    Ref document number: 15771749; Country of ref document: EP; Kind code of ref document: A1