US20110157327A1 - 3d audio delivery accompanying 3d display supported by viewer/listener position and orientation tracking - Google Patents


Info

Publication number
US20110157327A1
Authority
US
United States
Prior art keywords
viewer
audio
circuitry
orientation
reference information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/982,377
Inventor
Nambirajan Seshadri
Jeyhan Karaoguz
James D. Bennett
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Avago Technologies International Sales Pte Ltd
Original Assignee
Broadcom Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Broadcom Corp filed Critical Broadcom Corp
Priority to US12/982,377
Assigned to BROADCOM CORPORATION reassignment BROADCOM CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: KARAOGUZ, JEYHAN, SESHADRI, NAMBIRAJAN, BENNETT, JAMES D.
Publication of US20110157327A1
Assigned to BANK OF AMERICA, N.A., AS COLLATERAL AGENT reassignment BANK OF AMERICA, N.A., AS COLLATERAL AGENT PATENT SECURITY AGREEMENT Assignors: BROADCOM CORPORATION
Assigned to AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD. reassignment AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: BROADCOM CORPORATION
Assigned to BROADCOM CORPORATION reassignment BROADCOM CORPORATION TERMINATION AND RELEASE OF SECURITY INTEREST IN PATENTS Assignors: BANK OF AMERICA, N.A., AS COLLATERAL AGENT
Current legal status: Abandoned

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/361Reproducing mixed stereoscopic images; Reproducing mixed monoscopic and stereoscopic images, e.g. a stereoscopic image overlay window on a monoscopic image background
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B35/00Stereoscopic photography
    • G03B35/18Stereoscopic photography by simultaneous viewing
    • G03B35/24Stereoscopic photography by simultaneous viewing using apertured or refractive resolving means on screens or between screen and eye
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/001Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background
    • G09G3/003Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background to produce spatial visual effects
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/20Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/139Format conversion, e.g. of frame-rate or size
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/161Encoding, multiplexing or demultiplexing different image signal components
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/189Recording image signals; Reproducing recorded image signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/194Transmission of image signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/302Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
    • H04N13/305Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using lenticular lenses, e.g. arrangements of cylindrical lenses
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/302Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
    • H04N13/31Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using parallax barriers
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/302Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
    • H04N13/31Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using parallax barriers
    • H04N13/312Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using parallax barriers the parallax barriers being placed behind the display panel, e.g. between backlight and spatial light modulator [SLM]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/302Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
    • H04N13/31Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using parallax barriers
    • H04N13/315Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using parallax barriers the parallax barriers being time-variant
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/332Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/349Multi-view displays for displaying three or more geometrical viewpoints without viewer tracking
    • H04N13/351Multi-view displays for displaying three or more geometrical viewpoints without viewer tracking for displaying simultaneously
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/356Image reproducers having separate monoscopic and stereoscopic modes
    • H04N13/359Switching between monoscopic and stereoscopic modes
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/366Image reproducers using viewer tracking
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/366Image reproducers using viewer tracking
    • H04N13/383Image reproducers using viewer tracking for tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/398Synchronisation thereof; Control thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/235Processing of additional data, e.g. scrambling of additional data or processing content descriptors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/4104Peripherals receiving signals from specially adapted client devices
    • H04N21/4122Peripherals receiving signals from specially adapted client devices additional display device, e.g. video projector
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/435Processing of additional data, e.g. decrypting of additional data, reconstructing software from modules extracted from the transport stream
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04SSTEREOPHONIC SYSTEMS 
    • H04S7/00Indicating arrangements; Control arrangements, e.g. balance control
    • H04S7/30Control circuits for electronic adaptation of the sound field
    • H04S7/302Electronic adaptation of stereophonic sound system to listener position or orientation
    • H04S7/303Tracking of listener position or orientation
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B6/00Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2300/00Aspects of the constitution of display devices
    • G09G2300/02Composition of display devices
    • G09G2300/023Display panel composed of stacked panels
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00Control of display operating conditions
    • G09G2320/02Improving the quality of display appearance
    • G09G2320/028Improving the quality of display appearance by changing the viewing angle properties, e.g. widening the viewing angle, adapting the viewing angle to the view direction
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2370/00Aspects of data communication
    • G09G2370/04Exchange of auxiliary data, i.e. other than image data, between monitor and graphics controller
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/003Details of a display terminal, the details relating to the control arrangement of the display terminal and to the interfaces thereto
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/14Display of multiple viewports
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N2013/40Privacy aspects, i.e. devices showing different images to different viewers, the images not being viewpoints of the same scene
    • H04N2013/403Privacy aspects, i.e. devices showing different images to different viewers, the images not being viewpoints of the same scene the images being monoscopic
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N2013/40Privacy aspects, i.e. devices showing different images to different viewers, the images not being viewpoints of the same scene
    • H04N2013/405Privacy aspects, i.e. devices showing different images to different viewers, the images not being viewpoints of the same scene the images being stereoscopic or three dimensional

Definitions

  • the present invention generally relates to techniques for delivering audio to a listener based on an orientation of the listener.
  • Images may be generated for display in various forms.
  • television is a widely used telecommunication medium for transmitting and displaying images in monochromatic (“black and white”) or color form.
  • images are provided in analog form and are displayed by display devices in two-dimensions.
  • images are being provided in digital form for display in two-dimensions on display devices having improved resolution (e.g., “high definition” or “HD”).
  • Conventional displays may use a variety of techniques to achieve three-dimensional image viewing functionality.
  • various types of glasses have been developed that may be worn by users to view three-dimensional images displayed by a conventional display.
  • glasses include glasses that utilize color filters or polarized filters.
  • the lenses of the glasses pass two-dimensional images of differing perspective to the user's left and right eyes.
  • the images are combined in the visual center of the brain of the user to be perceived as a three-dimensional image.
  • synchronized left eye, right eye LCD (liquid crystal display) shutter glasses may be used with conventional two-dimensional displays to create a three-dimensional viewing illusion.
  • LCD display glasses are being used to display three-dimensional images to a user.
  • the lenses of the LCD display glasses include corresponding displays that provide images of differing perspective to the user's eyes, to be perceived by the user as three-dimensional.
  • a display device of this type may be switched to a three-dimensional mode for viewing of three-dimensional images, and may be switched to a two-dimensional mode for viewing of two-dimensional images (and/or to provide a respite from the viewing of three-dimensional images).
  • a parallax barrier is another example of a device that enables images to be displayed in three-dimensions.
  • a parallax barrier includes a layer of material with a series of precision slits. The parallax barrier is placed proximal to a display so that a user's eyes each see a different set of pixels to create a sense of depth through parallax.
  • a disadvantage of parallax barriers is that the viewer must be positioned in a well-defined location in order to experience the three-dimensional effect. If the viewer moves his/her eyes away from this “sweet spot,” image flipping and/or exacerbation of the eyestrain, headaches and nausea that may be associated with prolonged three-dimensional image viewing may result.
  • Conventional three-dimensional displays that utilize parallax barriers are also constrained in that the displays must be entirely in a two-dimensional image mode or a three-dimensional image mode at any time.
  • Surround sound provides different audio content through different audio channels in an effort to provide a fixed or forward perspective of a sound field to a viewer/listener at a fixed location (e.g., the aforementioned “sweet spot”).
  • the audio channels correspond to locations of speakers that surround the viewer (e.g., right, left, front, back, etc.).
  • surround sound has its limitations, especially when applied with respect to three-dimensional display devices. For example, if a viewer moves away from the fixed location, a degradation of the viewer's audio experience may result.
  • the sound that is projected from a speaker may be perceived as becoming louder as the viewer nears that speaker.
  • the sound that is projected from that speaker may be perceived as becoming quieter as the viewer moves away from the speaker.
  • when a viewer-specific audio device (e.g., headphones or earbuds) is used, movements by the viewer may have no effect on the viewer's perception of the sounds that are projected from speakers in the viewer-specific device.
  • FIG. 1A is a block diagram of a media system in accordance with an embodiment.
  • FIG. 1B shows an exemplary content capturing system in accordance with an embodiment.
  • FIG. 2 shows an exemplary viewer-located implementation of the audio system shown in FIG. 1A in accordance with an embodiment.
  • FIG. 3 is a block diagram of a media system in accordance with a first embodiment that includes reference information generation circuitry that implements a triangulation technique to determine an estimated location of a viewer.
  • FIG. 4 is a block diagram of a media system in accordance with a second embodiment that includes reference information generation circuitry that implements a triangulation technique to determine an estimated location of a viewer.
  • FIG. 5 is a block diagram of a media system in accordance with an embodiment that includes reference information generation circuitry that implements an infrared (IR) distance measurement system to help determine an estimated location of a viewer.
  • FIG. 6 is a block diagram of a media system in accordance with an embodiment that includes reference information generation circuitry that implements a magnetic field detection system to help determine an estimated location of a viewer.
  • FIG. 7 is a block diagram of a media system in accordance with an embodiment that includes viewer-located reference information generation circuitry that includes one or more cameras and one or more microphones for facilitating the generation of reference information corresponding to at least one positional characteristic of a viewing reference of a viewer.
  • FIG. 8 is a block diagram of a media system in accordance with an embodiment that includes reference information generation circuitry that includes a head orientation sensor and eye tracking circuitry for determining a head orientation and point of gaze, respectively, of a viewer.
  • FIG. 9 is a block diagram of a media system in accordance with an embodiment in which non-viewer-located camera(s) and/or microphone(s) operate to generate reference information corresponding to at least one positional characteristic of a viewing reference of a viewer.
  • FIG. 10 depicts a headset in accordance with an embodiment that includes reference information generation circuitry for facilitating the generation of reference information corresponding to at least one positional characteristic of a viewing reference of a viewer.
  • FIG. 11A depicts an embodiment in which reference information generation circuitry is distributed among a headset and a remote control that are connected to each other by a wired communication link.
  • FIG. 11B depicts an embodiment in which reference information generation circuitry is distributed among a headset and a laptop computer that are connected to each other by a wireless communication link.
  • FIG. 12 depicts a flowchart of a method for presenting three-dimensional content to a viewer having a viewing reference in accordance with an embodiment, wherein the manner in which such content is presented is controlled in accordance with reference information concerning the viewing reference.
  • FIG. 13 depicts a flowchart of a method for delivering video output and audio output to a viewer based at least in part on positional characteristic(s) relating to an orientation of the viewer in accordance with an embodiment.
  • FIG. 14 depicts a flowchart of a method for delivering an audio experience for ears of a listener via a plurality of speakers in accordance with an embodiment.
  • FIG. 15 is a block diagram of a media system in accordance with an embodiment that simultaneously presents first three-dimensional content to a first viewer having a first viewing reference and second three-dimensional content to a second viewer having a second viewing reference, wherein the manner in which such content is displayed is controlled in accordance with reference information concerning the first and second viewing references.
  • FIG. 16 depicts a flowchart of a method for presenting first three-dimensional content to a first viewer having a first viewing reference and simultaneously presenting second three-dimensional content to a second viewer having a second viewing reference in accordance with an embodiment, wherein the manner in which such content is presented is controlled in accordance with reference information concerning the first and second viewing references.
  • FIG. 17 depicts a flowchart of a method for delivering audio content to first and second viewers of a display capable of simultaneously presenting first video content to the first viewer and second video content to the second viewer in accordance with an embodiment.
  • FIG. 18 is a block diagram of an example implementation of an adaptable two-dimensional/three-dimensional media system in accordance with an embodiment.
  • references in the specification to “one embodiment,” “an embodiment,” “an example embodiment,” etc., indicate that the embodiment described may include a particular feature, structure, or characteristic, but every embodiment may not necessarily include the particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same embodiment. Further, when a particular feature, structure, or characteristic is described in connection with an embodiment, it is submitted that it is within the knowledge of one skilled in the art to effect such feature, structure, or characteristic in connection with other embodiments whether or not explicitly described.
  • FIG. 1A is a block diagram of a media system 102 that presents three-dimensional content to a viewer 106 having a viewing reference 108 (i.e., an orientation) in accordance with an embodiment.
  • media system 102 includes an adaptable screen assembly 122, driver circuitry 124, processing circuitry 126, an audio system 128, reference information generation circuitry 110 a, and reference information generation circuitry 110 b.
  • media system 102 operates to deliver light that includes one or more viewable images to a viewing area that includes viewer 106 .
  • Media system 102 also operates to deliver audio content that is associated with the one or more viewable images toward viewer 106 .
  • Media system 102 may include, for example and without limitation, a television, a projection system, a home theater system, a monitor, a computing device (e.g., desktop computer, laptop computer, tablet computer) or a handheld device (e.g., a cellular phone, smart phone, personal media player, personal digital assistant), wherein the computing device or handheld device has at least one attached or integrated display.
  • Adaptable screen assembly 122 is designed such that certain display characteristics associated therewith can be modified to support multiple viewing modes. For example, certain display characteristics associated with adaptable screen assembly 122 may be modified to selectively present images in a two-dimensional viewing mode or one or more three-dimensional viewing modes.
  • display characteristics associated with screen assembly 122 may be modified to display a single image of certain subject matter to provide a two-dimensional view thereof, to display two images of the same subject matter viewed from different perspectives in a manner that provides a single three-dimensional view thereof, or to display a multiple of two images (e.g., four images, eight images, etc.) of the same subject matter viewed from different perspectives in a manner that simultaneously provides multiple three-dimensional views thereof, wherein the particular three-dimensional view perceived by a viewer is dependent at least in part upon the position of the viewer (also referred to herein as a “multi-view three-dimensional viewing mode”).
  • adaptable screen assembly 122 may also be capable of simultaneously presenting two dimensional views and three-dimensional views in different regions of the same screen, respectively.
  • adaptable screen assembly 122 may be capable of simultaneously presenting a two-dimensional view of first visual content in a first region of a screen, and one or more three-dimensional views of second visual content in a second region of the screen.
  • Adaptable screen assemblies having such capabilities are described in commonly-owned, co-pending U.S. patent application Ser. No. 12/845,440, filed on Jul. 28, 2010, and entitled “Adaptable Parallax Barrier Supporting Mixed 2D and Stereoscopic 3D Display Regions,” the entirety of which is incorporated by reference herein.
  • a display characteristic of adaptable screen assembly 122 that may be modified to switch between different full-screen and regional two-dimensional and three-dimensional viewing modes may include a configuration of an adaptable light manipulator such as an adaptable parallax barrier.
  • An adaptable lenticular lens may also be used as an adaptable light manipulator to switch between different full-screen three-dimensional viewing modes. Descriptions of such adaptable light manipulators and methods for dynamically modifying the same may be found in the aforementioned, incorporated U.S. patent application Ser. No. 12/774,307, filed on May 5, 2010 and entitled “Display with Elastic Light Manipulator,” and in U.S. patent application Ser. No. 12/845,409, filed on Jul. 28, 2010.
  • the degree of stretching of an adaptable lenticular lens may be modified in order to support certain three-dimensional viewing modes.
  • barrier elements of an adaptable parallax barrier may be selectively placed in a blocking or non-blocking state in order to support certain full-screen and regional two-dimensional and three-dimensional viewing modes.
  • another display characteristic of adaptable screen assembly 122 that may be modified to switch between different full-screen and regional two-dimensional and three-dimensional viewing modes is the manner in which image content is mapped to display pixels of a pixel array, as described in commonly-owned, co-pending U.S. patent application Ser. No. 12/774,225, filed on May 5, 2010 and entitled “Controlling a Pixel Array to Support an Adaptable Light Manipulator,” the entirety of which is incorporated by reference herein.
  • Yet another display characteristic that may be modified to achieve such switching includes the manner in which backlighting is generated by a backlighting array or other non-uniform light generation element, as described in commonly-owned, co-pending U.S. patent application Ser. No. ______ (Attorney Docket No. A05.01210000), filed on even date herewith and entitled “Backlighting Array Supporting Adaptable Parallax Barrier,” the entirety of which is incorporated by reference herein.
  • the adaptation of the display characteristics of adaptable screen assembly 122 may be carried out, in part, by sending coordinated drive signals to various elements (e.g., a non-uniform backlight generator, a pixel array and/or an adaptable light manipulator) that are included in adaptable screen assembly 122 .
  • This function is performed by driver circuitry 124 responsive to the receipt of control signals from processing circuitry 126 .
  • a manner in which such coordinated drive signals may be generated is described in U.S. patent application Ser. No. ______ (Attorney Docket No. A05.01240000), filed on even date herewith and entitled “Coordinated Driving of Adaptable Light Manipulator, Backlighting and Pixel Array in Support of Adaptable 2D and 3D Displays.”
  • Audio system 128 includes a plurality of speakers 130 a - 130 g that provide respective portions 132 a - 132 g of audio output based on underlying audio content; that audio content is associated with the visual output, based on underlying video content, that adaptable screen assembly 122 delivers to viewer 106. Audio system 128 provides the portions 132 a - 132 g in accordance with a spatial orientation of the audio content that is controlled by driver circuitry 124.
  • a spatial orientation of audio content is a configuration of the audio content in which characteristics of respective portions (e.g., portions 132 a - 132 g ) of the audio content, which correspond to respective speakers (e.g., speakers 130 a - 130 g ), indicate an orientation of ears of a viewer/listener (e.g., viewer 106 ) with respect to sound source(s) that correspond to (e.g., are depicted in) a three-dimensional view.
  • Such a characteristic may include an amplitude of sound that corresponds to a sound source and/or delays that are associated with the sound that corresponds to the sound source and associated reflections thereof.
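As an editorial aid (not part of the patent text), the following is a minimal sketch of how the two characteristics named above, amplitude and delay, might be derived for one sound source relative to one ear position, assuming free-field propagation and an inverse-distance amplitude law; all function names and constants are illustrative assumptions.

```python
import math

SPEED_OF_SOUND_M_S = 343.0  # approximate speed of sound in air

def portion_characteristics(source_xyz, ear_xyz, reference_gain=1.0):
    """Return (amplitude, delay_seconds) for the audio portion feeding the
    speaker nearest the given ear: inverse-distance amplitude plus a
    straight-line propagation delay. An illustrative assumption, not the
    patent's specified processing."""
    distance = math.dist(source_xyz, ear_xyz)
    amplitude = reference_gain / max(distance, 0.1)  # clamp to avoid blow-up
    delay_seconds = distance / SPEED_OF_SOUND_M_S
    return amplitude, delay_seconds

# A source 2 m ahead and 1 m to the right of the listener arrives louder
# and earlier at the right ear than at the left ear.
right = portion_characteristics((1.0, 2.0, 0.0), (0.1, 0.0, 0.0))
left = portion_characteristics((1.0, 2.0, 0.0), (-0.1, 0.0, 0.0))
```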
  • adaptable screen assembly 122 may deliver a three-dimensional view to viewer 106 .
  • Driver circuitry 124 and speakers 130 a - 130 g may operate in conjunction to provide portions 132 a - 132 g of audio content that are associated with the three-dimensional view.
  • driver circuitry 124 may modify a spatial orientation of the audio content to take into consideration a change in orientation of viewer 106 with respect to sound sources that correspond to (e.g., are depicted in) the three-dimensional view and/or changes in orientation of such sound sources with respect to viewer 106 .
  • Driver circuitry 124 modifies the spatial orientation of the audio content responsive to the receipt of control signals from processing circuitry 126 .
  • the audio content may include a first audio portion 132 a that corresponds to a right side speaker 130 a and a second audio portion 132 b that corresponds to a left side speaker 130 b.
  • if viewer 106 moves to the right of a sound source depicted in the three-dimensional view, the spatial orientation of the audio content may be modified by decreasing an amplitude of a sound that corresponds to the sound source in the first audio portion 132 a and/or increasing a delay that is associated with the sound in the first audio portion 132 a.
  • the spatial orientation of the audio content may be modified by increasing an amplitude of the sound in the second audio portion 132 b and/or decreasing a delay that is associated with the sound in the second audio portion 132 b. Accordingly, the resulting spatial orientation of the audio content may provide an audio experience that is consistent with the viewer moving to the right of the sound source in a real-world environment.
  • a first and a second object, both identical visually and audibly, might be presented in a three-dimensional sensory environment via both the adaptable screen assembly 122 and the speakers 130 a - g.
  • the first object might be presented as being much closer to the viewer 106 than the second object.
  • for the first object, the audio output from those of the speakers 130 a - g to the right of the viewer versus those to the left might exhibit first left side and first right side delays and amplitudes.
  • for the second object, the audio output might exhibit second left side and second right side delays and amplitudes.
  • as the viewer's reference changes, the first left and right delays and amplitudes associated with the first object will be adapted to produce a listening experience that corresponds to the visual change in reference and the associated visual experience.
  • the second left and right delays and amplitudes will be adapted differently because of the more distant location of the second object in the 3D visual experience. That is, because the second object is to be perceived as being far away, its visual presentation will not change as much or as rapidly as that of the first object.
  • likewise, the audio output portion corresponding to the second object (that which produces the audible 3D listening experience associated with the second object) need not change as much or as fast.
  • each of a plurality of audio/video objects has a 3D position and orientation that adapts both visually and audibly to attempt to convince a viewer-listener that the experience is a real-world environment.
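Continuing the editorial sketch above (again an illustration, not the patent's own method), the same helper shows why the near object's audio portions must adapt more than the far object's for an identical viewer movement:

```python
# Reuses portion_characteristics() from the earlier sketch.
near_obj, far_obj = (0.0, 1.0, 0.0), (0.0, 20.0, 0.0)

for label, obj in (("near", near_obj), ("far", far_obj)):
    before = portion_characteristics(obj, (0.0, 0.0, 0.0))
    after = portion_characteristics(obj, (0.5, 0.0, 0.0))  # viewer steps 0.5 m right
    change_pct = 100.0 * (after[0] / before[0] - 1.0)
    print("%s object amplitude change: %.1f%%" % (label, change_pct))

# The near object's amplitude shifts by roughly -11%, while the far object's
# shifts by only a few hundredths of a percent, so the far object's audio
# portions need not change as much or as fast.
```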
  • Audio system 128 is shown in FIG. 1 to include seven speakers 130 a - 130 g for illustrative purposes and is not intended to be limiting. It will be recognized that audio system 128 may include any suitable number of speakers. Moreover, such speakers may be arranged in any suitable configuration. Audio system 128 may include viewer-specific speakers (e.g., speakers in headphones or earbuds) and/or viewer-agnostic speakers (e.g., speakers mounted on walls of a room or in a dashboard, door panels, etc. of a car). An example of a viewer-located implementation of audio system 128 that includes viewer-specific speakers is shown in FIG. 2 , which is described below.
  • Reference information generation circuitry 110 a and 110 b comprise components of media system 102 that operate in conjunction to produce reference information concerning at least one positional characteristic of viewing and listening reference 108 (i.e., orientation) of viewer 106 with respect to adaptable screen assembly 122 .
  • Viewing and listening reference 108 may include any of a number of positional characteristics that affect how three-dimensional visual content displayed via adaptable screen assembly 122 and/or how audio content provided by audio system 128 will be perceived by viewer 106 .
  • Such positional characteristics may include, for example and without limitation, a position or location of viewer 106 relative to adaptable screen assembly 122 , a head orientation of viewer 106 , ear orientation of the viewer 106 , and/or a point of gaze of viewer 106 .
  • the position or location of viewer 106 (and both the eyes and ears thereof) relative to adaptable screen assembly 122 may include a distance from adaptable screen assembly 122 or some reference point associated therewith, and such distance may include both horizontal distance and elevation. Furthermore, the position or location of viewer 106 may also include eye and ear locations of viewer 106 .
  • the head orientation of viewer 106 may include a degree of tilt and/or rotation of the head of viewer 106 .
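To make the enumeration above concrete, here is a small illustrative record gathering the positional characteristics of a viewing and listening reference; the field names are editorial assumptions, not the patent's data structures.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class ViewingListeningReference:
    position: Vec3                                      # viewer location relative to the screen
    eye_locations: Optional[Tuple[Vec3, Vec3]] = None   # left eye, right eye
    ear_locations: Optional[Tuple[Vec3, Vec3]] = None   # left ear, right ear
    head_tilt_deg: float = 0.0                          # degree of head tilt
    head_rotation_deg: float = 0.0                      # degree of head rotation
    point_of_gaze: Optional[Vec3] = None                # where the viewer's gaze falls
```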
  • the reference information produced by reference information generation circuitry 110 a and 110 b is provided to processing circuitry 126.
  • based on this reference information, processing circuitry 126 causes modification of at least one of the display characteristics of adaptable screen assembly 122 and/or modification of the spatial orientation of the audio content that is provided by audio system 128.
  • Such modifications may be caused by causing appropriate drive signals to be generated by driver circuitry 124 .
  • Such modification may be performed, for example, to deliver a particular three-dimensional view to viewer 106 and/or tailored audio output based on underlying audio content associated therewith in accordance with one or more positional characteristics of viewing and listening reference 108 .
  • media system 102 is capable of delivering three-dimensional content and/or associated audio content to viewer 106 in an optimized manner.
  • Reference information generation circuitry 110 a is intended to represent viewer-located circuitry that is situated on or near viewer 106 .
  • reference information generation circuitry 110 a may comprise circuitry that is incorporated into one or more portable devices or housings which are worn on or carried by viewer 106 .
  • portable devices or housings may include, but are not limited to, a headset, glasses, an earplug, a pendant, a wrist-mounted device, a remote control, a game controller, a handheld personal device (such as a cellular telephone, smart phone, personal media player, personal digital assistant or the like) and a portable computing device (such as a laptop computer, tablet computer or the like).
  • Such viewer-located circuitry may be designed to leverage a proximity to the user to assist in generating the above-described reference information.
  • Reference information generation circuitry 110 b is intended to represent circuitry that is not viewer-located. As will be discussed in reference to particular embodiments described herein, reference information generation circuitry 110 b is configured to operate in conjunction with reference information generation circuitry 110 a to generate the above-described reference information and to provide such reference information to processing circuitry 126 .
  • Reference information generation circuitry 110 b is shown in FIG. 1A to include first stationary location support circuitry (SLSC) 134 a and second SLSC 134 b for illustrative purposes, though the embodiments are not limited in this respect.
  • First and second SLSCs 134 a and 134 b are configured to support production of the reference information described above.
  • first and second SLSCs 134 a and 134 b exchange signals (either sending or receiving, depending on the embodiment) with a viewer positioned device (VPD) to assist in generating location information (via trilateration, triangulation, etc.) regarding viewer 106.
  • the VPD may be included in reference information generation circuitry 110 a.
  • At least a portion of the first and second SLSCs 134 a and 134 b may be within separate housings with a communication link back toward processing circuitry 126 .
  • only one SLSC (e.g., first SLSC 134 a or second SLSC 134 b ) housed with processing circuitry 126 might be used.
  • a SLSC may capture images of the viewing environment (perhaps even in the infrared spectrum and with a lower resolution camera) to support identification of viewer 106 and gathering of the viewer's associated location, eye/ear orientation or some other reference.
  • Further SLSCs can support more accurate gathering of such reference information, or the gathering of additional reference information.
  • the reference information may be based on sensor data within the viewer positioned devices (VPDs) as well, including data from a similar type of camera that captures an image of the screen and, based thereon, attempts to generate orientation and distance information.
  • in a first embodiment, a VPD transmits only, and two or more SLSCs (e.g., first and second SLSCs 134 a and 134 b ) receive only.
  • in a second embodiment, a VPD receives only, and two or more SLSCs transmit only.
  • in a third embodiment, two or more SLSCs transmit first and a VPD responds with time markers. The markers may be used directly (e.g., when accurate time synchronization between the SLSCs and the VPD exists), or the SLSCs may record total round trip time and subtract therefrom the local turn-around time, conveyed via the marker information, so that the VPD's clock may remain unsynchronized.
  • in a fourth embodiment, a VPD transmits first and two or more SLSCs respond with time markers, with the VPD either using the markers directly or recording total round trip time and subtracting therefrom the local turn-around time, conveyed via the marker information, so that the SLSCs' clocks may remain unsynchronized.
  • in a fifth embodiment, any of the techniques described above with reference to the first through fourth embodiments may be used, and the underlying communication circuitry may support both location determination and normal, unrelated communications.
  • in a sixth embodiment, any of the techniques described above with reference to the first through fifth embodiments may be used, with the SLSCs (and/or/via processing circuitry 126 ) coordinating timing amongst themselves.
  • in a seventh embodiment, any of the techniques described above with reference to the first through sixth embodiments may be used, and the actual calculations based on such gathered information can be performed anywhere (e.g., in whole or in part in the VPD, in one or more of the SLSCs, and/or in processing circuitry 126 ). A round-trip-time sketch follows below.
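The following is a hedged sketch of the round-trip-time variants above: a distance is recovered from total round-trip time minus the responder's reported turn-around time, and two such SLSC-to-VPD ranges are then trilaterated in 2D. The geometry is standard; the function names and the RF propagation-speed assumption are editorial, not taken from the patent.

```python
import math

PROPAGATION_SPEED_M_S = 3.0e8  # RF exchange assumed; use ~343.0 for acoustic ranging

def distance_from_rtt(round_trip_s, turn_around_s):
    """One-way distance implied by a round trip minus local turn-around time."""
    return (round_trip_s - turn_around_s) / 2.0 * PROPAGATION_SPEED_M_S

def trilaterate_2d(p1, r1, p2, r2):
    """Intersect two range circles centered on SLSC positions p1 and p2
    (assumed mounted along the screen at y = 0) and return the solution
    on the viewing-area side (the larger y of the two mirror solutions)."""
    d = math.dist(p1, p2)
    a = (r1 ** 2 - r2 ** 2 + d ** 2) / (2.0 * d)
    h = math.sqrt(max(r1 ** 2 - a ** 2, 0.0))
    mx = p1[0] + a * (p2[0] - p1[0]) / d
    my = p1[1] + a * (p2[1] - p1[1]) / d
    c1 = (mx + h * (p2[1] - p1[1]) / d, my - h * (p2[0] - p1[0]) / d)
    c2 = (mx - h * (p2[1] - p1[1]) / d, my + h * (p2[0] - p1[0]) / d)
    return c1 if c1[1] >= c2[1] else c2
```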
  • adaptable screen assembly 122, driver circuitry 124, processing circuitry 126, reference information generation circuitry 110 b, and optionally some portion of audio system 128 are all integrated within a single housing (e.g., a television or other display device).
  • adaptable screen assembly 122, driver circuitry 124, at least some portion of processing circuitry 126, and optionally some portion of audio system 128 are integrated within a first housing (e.g., a television); reference information generation circuitry 110 b and optionally some portion of processing circuitry 126 are integrated within a second housing attached thereto (e.g., a set-top box, gateway device or media device); and at least a portion of audio system 128 is integrated within one or more third housings (e.g., one or more stand-alone speaker assemblies) connected via wired or wireless connection(s) to driver circuitry 124.
  • first and second stationary location support circuitry (SLSCs) 134 a and 134 b coordinate their activities in interaction with reference information generation circuitry 110 a to produce at least a portion of the reference information.
  • Such production may involve 2D/3D trilateration, 2D/3D triangulation, or other location processing to yield such portion of the reference information. Still other arrangements and distributions of this circuitry may be used.
  • FIG. 1B shows an exemplary content capturing system 100 in accordance with an embodiment.
  • content capturing system 100 includes a plurality of cameras 160 A- 160 D, a plurality of background microphones 162 A- 162 D, a plurality of first target microphones 154 A- 154 D, a plurality of second target microphones 156 A- 156 D, and a plurality of Nth target microphones 158 A- 158 D.
  • Cameras 160 A- 160 D are configured to simultaneously capture respective instances of content that represent respective camera views (a.k.a. perspectives) of common subject matter.
  • such subject matter may include a plurality of audio targets, such as first audio target 152 A, second audio target 152 B, and Nth audio target 152 N. Any two of the perspectives may be combined to provide a 3D viewing experience.
  • Background microphones 162 A- 162 D simultaneously capture instances of audio content that correspond to the respective instances of content that are captured by cameras 160 A- 160 D.
  • background microphones 162 A- 162 D may be placed proximate to (or attached to) respective cameras 160 A- 160 D.
  • First target microphones 154 A- 154 D are placed proximate to first audio target 152 A for capturing sounds that are provided by first audio target 152 A.
  • Second target microphones 156 A- 156 D are placed proximate to second audio target 152 B for capturing sounds that are provided by second audio target 152 B.
  • Nth target microphones 158 A- 158 D are placed proximate to Nth audio target 152 N for capturing sounds that are provided by Nth audio target 152 N.
  • Microphones 154 A, 156 A, and 158 A capture sounds of respective audio targets 152 A, 152 B, and 152 N that are associated with the instance of content that is captured by first camera 160 A.
  • Microphones 154 B, 156 B, and 158 B capture sounds of respective audio targets 152 A, 152 B, and 152 N that are associated with the instance of content that is captured by second camera 160 B.
  • Microphones 154 C, 156 C, and 158 C capture sounds of respective audio targets 152 A, 152 B, and 152 N that are associated with the instance of content that is captured by third camera 160 C.
  • Microphones 154 D, 156 D, and 158 D capture sounds of respective audio targets 152 A, 152 B, and 152 N that are associated with the instance of content that is captured by fourth camera 160 D.
  • audio that is captured by microphones 162 A, 154 A, 156 A, and 158 A may be combined to provide the instance of audio that is associated with the instance of content that is captured by first camera 160 A.
  • Audio that is captured by microphones 162 B, 154 B, 156 B, and 158 B may be combined to provide the instance of audio that is associated with the instance of content that is captured by second camera 160 B, and so on.
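Below is a minimal sketch (an editorial assumption about one plausible mix, not the patent's specified processing) of combining a camera's background microphone with the target microphones that share its perspective:

```python
def mix_camera_instance(background, targets, target_gain=0.8):
    """background: list of audio samples from the camera's background mic;
    targets: per-target sample lists captured from the same perspective.
    Assumes equal-length tracks; the gain value is a placeholder."""
    mixed = list(background)
    for track in targets:
        for i, sample in enumerate(track[: len(mixed)]):
            mixed[i] += target_gain * sample
    return mixed

# e.g., first camera's instance of audio (variable names are illustrative):
# mix_camera_instance(mic_162A, [mic_154A, mic_156A, mic_158A])
```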
  • Four cameras 160 A- 160 D, four background microphones 162 A- 162 D, four first target microphones 154 A- 154 D, four second target microphones 156 A- 156 D, and four Nth target microphones 158 A- 158 D are shown in FIG. 1B for illustrative purposes and are not intended to be limiting. It will be recognized that the configuration of FIG. 1B may include any suitable number of cameras, background microphones, and/or target microphones.
  • a sensory (auditory and/or visual) 3D experience can be captured as illustrated or produced conceptually based on the configuration of FIG. 1B .
  • processing circuitry 126 of FIG. 1A could receive only one audio set (e.g., a Dolby 5.1 audio set, a Dolby 7.1 audio set, or any other suitable type of audio set) along with a 3D8 data set.
  • An audio set is a set of channels driving a corresponding set of speakers.
  • processing circuitry 126 selectively delivers a set of four camera views out of the available eight to the screen. The selection of the four camera views is based on viewer reference information.
  • as the viewer moves from left to right, processing circuitry 126 may choose cameras 2 - 5, then cameras 3 - 6, then cameras 4 - 7, and finally cameras 5 - 8 when the viewer has reached the far right of adaptable screen assembly 122, as sketched in the code below.
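A minimal sketch of that view-window selection: map the viewer's normalized horizontal position to a window of four adjacent camera views out of eight. The linear mapping is an assumption; the patent does not specify a formula.

```python
def select_camera_window(norm_x, total_views=8, window=4):
    """norm_x: 0.0 at the far left of adaptable screen assembly 122,
    1.0 at the far right. Returns the chosen 1-based camera numbers."""
    start = 1 + round(norm_x * (total_views - window))  # start in 1..5
    return list(range(start, start + window))

# Sweeping norm_x from 0.0 to 1.0 steps the window through cameras 1-4,
# 2-5, 3-6, 4-7, and finally 5-8 at the far right.
```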
  • an adaptive light manipulator portion of adaptable screen assembly 122 will operate to enhance the 3D visual experience. It will be recognized that this can also be done with only a single 3D4 video stream and a 3D4 screen, without substituting cameras, via the light manipulator functionality alone.
  • the received single audio set can be manipulated by processing circuitry 126 to create a more realistic 3D experience in synchrony with such changing video.
  • multiple portions of further audio data can be made available to processing circuitry 126 from which a set can be selected or produced and balanced based on the viewer's reference information.
  • Such multiple portions of further audio data may be captured and/or generated in any of a variety of ways.
  • a plurality of audio sets can be captured or constructed to service a selected set of viewer locations.
  • processing circuitry 126 can migrate between both the camera selections and the plurality of audio sets as the viewer moves from left to right as mentioned above, so that the viewer receives a more accurate real-world sensory experience. Transitions between audio sets might be smoothed via a weighted combination of the channels of two adjacent audio sets, as sketched below.
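A hedged sketch of that smoothing idea: when the viewer sits between the reference locations of two adjacent audio sets, blend the sets channel by channel with a weighted combination. Channel layouts are assumed to match between the two sets; the representation is illustrative.

```python
def blend_audio_sets(set_a, set_b, weight_b):
    """set_a/set_b: dicts mapping channel name -> list of samples.
    weight_b in [0, 1] is the viewer's fractional progress from the
    reference location of set A toward that of set B."""
    weight_a = 1.0 - weight_b
    return {ch: [weight_a * a + weight_b * b
                 for a, b in zip(set_a[ch], set_b[ch])]
            for ch in set_a}
```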
  • each of the target microphones 154 A- 154 D, 156 A- 156 D, and 158 A- 158 D may be a stereo microphone, and each may produce two channels. It is noted that rather than such channels being captured, the channels can be produced without physical microphones. For instance, the channels may be produced via software using the same reference points as the illustrated microphones that surround each point of origin.
  • a plurality of audio sets can be produced and delivered downstream to processing circuitry 126 for selection therefrom to support a current viewer point of reference. Less significant sounds such as background music or background noise can be captured or produced as well via the background microphones 162 A- 162 D, which also may be stereo microphones that each produce two channels.
  • processing circuitry 126 may be provided with all of the microphone channels related to the background microphones 162 A- 162 D and the audio target microphones 154 A- 154 D, 156 A- 156 D, and 158 A- 158 D. With such access, processing circuitry 126 can generate the audio sets itself, perhaps providing a more realistic 3D sensory environment.
  • whether processing circuitry 126 receives pre-constructed audio sets or the audio data from all of the microphones 162 A- 162 D, 154 A- 154 D, 156 A- 156 D, and 158 A- 158 D, processing circuitry 126 may still “balance” the audio set output to conform to the viewer's position with reference to the actual viewing/listening room layout.
  • FIG. 2 shows an audio system 200 , which is an exemplary viewer-located implementation of audio system 128 shown in FIG. 1 , in accordance with an embodiment.
  • audio system 200 includes a support element 202 , a right ear speaker assembly 204 a, and a left ear speaker assembly 204 b.
  • Support element 202 is configured to be placed on or around a viewer's head.
  • Right ear speaker assembly 204 a is configured to be placed proximate the viewer's right ear.
  • Left ear speaker assembly 204 b is configured to be placed proximate the viewer's left ear.
  • the right and left ear speaker assemblies 204 a and 204 b may include respective portions of a plurality of speakers.
  • left ear speaker assembly 204 b is shown to include speakers 130 d, 130 b, and 130 g of audio system 128 in FIG. 1 for illustrative purposes.
  • Speakers 130 d, 130 b, and 130 g provide respective portions 132 d, 132 b, and 132 g of audio content to the viewer's left ear.
  • Speaker 130 d is configured to be positioned toward a back edge of the viewer's left ear to cause the viewer to perceive audio content 132 d as originating from the left rear of the viewer.
  • Speaker 130 b is configured to be positioned toward a side of the viewer's left ear to cause the viewer to perceive audio content 132 b as originating from the left side of the viewer.
  • Speaker 130 g is configured to be positioned toward a front edge of the viewer's left ear to cause the viewer to perceive audio content 132 g as originating from the left front of the viewer.
  • Left ear speaker assembly 204 b further includes speakers 230 f and 230 h.
  • Speaker 230 f provides portion 132 f of the audio content to the viewer's left ear.
  • Speaker 230 f is configured to be positioned toward the front edge of the viewer's left ear.
  • a corresponding speaker is included in speaker assembly 204 a and configured to be positioned toward a front edge of the viewer's right ear to provide portion 132 f of the audio content to the viewer's right ear.
  • the provision of portion 132 f of the audio content is intended to cause the viewer to perceive the portion 132 f as originating in front of the viewer.
  • Speaker 230 h is intended to serve as a subwoofer that provides a relatively low-frequency portion of the audio content to the viewer's left ear.
  • the placement of speaker 230 h in speaker assembly 204 b may not substantially affect the viewer's perception of the originating location of the low-frequency portion of the audio content.
  • Right ear speaker assembly 204 a includes speakers that are complementary to those that are included in left ear speaker assembly 204 b, though not shown in FIG. 2 .
  • the right ear speaker assembly 204 a may include speakers that are configured similarly to speakers 130 d, 130 b, and 130 g in left ear speaker assembly 204 b to provide respective audio portions 132 c, 132 a, and 132 e of the audio content.
  • Right ear speaker assembly 204 a may also include a speaker corresponding to speaker 230 h in left ear speaker assembly 204 b to provide the low-frequency portion of the audio content to the viewer's right ear.
  • Audio system 200 further includes a receiver 206 , a battery 208 , a transmitter 210 , a right ear orientation element 210 a, a left ear orientation element 210 b, and circuitry 212 .
  • Right ear orientation element 210 a corresponds to (e.g., is attached to or is incorporated in) right ear speaker assembly 204 a, and left ear orientation element 210 b corresponds to (e.g., is attached to or is incorporated in) left ear speaker assembly 204 b.
  • Right ear orientation element 210 a is configured to at least assist in determining a position of the viewer's right ear.
  • Left ear orientation element 210 b is configured to at least assist in determining a position of the viewer's left ear.
  • Right and left ear orientation elements 210 a and 210 b operate in conjunction to at least assist in detection of an orientation of the viewer's ears or head.
  • right and left ear orientation elements 210 a and 210 b may be included in reference information generation circuitry 110 a of FIG. 1 . Some techniques that may utilize right and left ear orientation elements 210 a and 210 b for detecting an orientation of the viewer's ears or head are discussed below with respect to FIGS. 7 and 8 ; a simple geometric sketch also follows.
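  • For illustration, a minimal sketch of how two ear positions could yield a head position and yaw (the coordinate convention and function name are assumptions, not taken from the patent):

```python
import numpy as np

def head_pose_from_ears(right_ear_xy, left_ear_xy):
    """Head position and yaw (top-down 2D view) from the two ear-element
    locations, assuming the right ear lies 90 degrees clockwise from the
    facing direction."""
    r = np.asarray(right_ear_xy, dtype=float)
    l = np.asarray(left_ear_xy, dtype=float)
    position = (r + l) / 2.0                      # head center = ear midpoint
    ear_axis = r - l                              # left ear toward right ear
    yaw = np.arctan2(ear_axis[0], -ear_axis[1])   # facing angle, radians
    return position, yaw

pos, yaw = head_pose_from_ears((0.08, 0.0), (-0.08, 0.0))
print(pos, np.degrees(yaw))   # head at the origin, facing +y (90 degrees)
```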
  • Transmitter 210 is configured to transmit information regarding the orientation of the viewer, such as information regarding the location of the viewer's left ear, information regarding the location of the viewer's right ear, and/or information regarding the orientation of the viewer's ears or head, for further processing (e.g., by reference information generation circuitry 110 b and/or processing circuitry 126 of FIG. 1 ).
  • Circuitry 212 includes a portion of driver circuitry 124 shown in FIG. 1 for controlling the speakers that are included in right and left speaker assemblies 204 a and 204 b. Circuitry 212 controls the speakers based on control signals that are received from processing circuitry 126 via receiver 206 .
  • processing circuitry 126 may include a transmitter (not shown in FIG. 1 ) for transmitting the control signals to audio system 200 .
  • Processing circuitry 126 may transmit the control signals via a wired or wireless communication pathway.
  • circuitry 212 modifies the control signals that are received from processing circuitry 126 to take into consideration the orientation of the viewer. In accordance with this embodiment, circuitry 212 controls the speakers based on the modified control signals.
  • transmitter 210 transmits information regarding the orientation of the viewer to reference information generation circuitry 110 b.
  • Reference information generation circuitry 110 b provides the information regarding the orientation of the viewer to processing circuitry 126 , which modifies control signals based on the orientation of the viewer and transmits those modified control signals to audio system 200 .
  • circuitry 212 controls the speakers based on the modified control signals, which are received from processing circuitry 126 via receiver 206 . Battery 208 provides power to the various speakers.
  • Audio system 200 is shown in FIG. 2 to be implemented as headphones for illustrative purposes and is not intended to be limiting. Audio system 200 may be implemented using any suitable viewer-located or viewer-remote system, or a combination thereof. For example, audio system 200 may be implemented at least in part as eyewear that has speaker assemblies 204 a and 204 b attached thereto.
  • Various embodiments of media system 102 of FIG. 1 will now be described in reference to FIGS. 3-8 .
  • Each of these embodiments utilizes different implementations of reference information generation circuitry 110 a and 110 b to produce reference information for provision to processing circuitry 126 .
  • These different implementations are described herein by way of example only and are not intended to be limiting.
  • portions 132 a - 132 g of audio content are referred to cumulatively as audio content 132 .
  • FIG. 3 is a block diagram of a first embodiment of media system 102 in which reference information generation circuitry 110 a and 110 b jointly implement a triangulation technique to determine an estimated location of viewer 106 relative to adaptable screen assembly 122 .
  • reference information generation circuitry 110 a includes a transmitter 306 that is operable to transmit a location tracking signal 308 .
  • Location tracking signal 308 may comprise, for example, a radio frequency (RF) signal or other wireless signal.
  • reference information generation circuitry 110 b includes a plurality of receivers 302 1 - 302 N and triangulation circuitry 304 connected thereto.
  • Receivers 302 1 - 302 N are operable to receive corresponding versions 310 1 - 310 N of location tracking signal 308 .
  • Triangulation circuitry 304 is operable to determine an estimated location of viewer 106 based on characteristics of the received versions 310 1 - 310 N of location tracking signal 308 . For example, triangulation circuitry 304 may determine the estimated location of viewer 106 by measuring relative time delays between the received versions 310 1 - 310 N of location tracking signal 308 , although this is only an example. The estimated location of viewer 106 is then provided by triangulation circuitry 304 to processing circuitry 126 as part of the above-described reference information.
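  • A sketch of one such relative-time-delay (TDOA) solver, using a generic nonlinear least-squares fit; the propagation speed, receiver layout, and solver choice are assumptions for illustration, not the patent's prescribed method:

```python
import numpy as np
from scipy.optimize import least_squares

C = 3.0e8   # assumed propagation speed of the RF tracking signal, m/s

def locate_by_tdoa(receivers, tdoas):
    """Fit the 2D transmitter position whose predicted path differences
    (receiver i minus receiver 0) best match the measured delays."""
    rx = np.asarray(receivers, dtype=float)
    range_diffs = np.asarray(tdoas, dtype=float) * C   # delays -> meters

    def residuals(p):
        d = np.linalg.norm(rx - p, axis=1)   # distance to each receiver
        return (d[1:] - d[0]) - range_diffs

    return least_squares(residuals, x0=rx.mean(axis=0)).x

# Self-check with synthetic delays from a known viewer position:
rx = [(0.0, 0.0), (4.0, 0.0), (0.0, 4.0), (4.0, 4.0)]
true_pos = np.array([1.0, 2.5])
d = np.linalg.norm(np.asarray(rx) - true_pos, axis=1)
print(locate_by_tdoa(rx, (d[1:] - d[0]) / C))   # approximately [1.0  2.5]
```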
  • Transmitter 306 is operable to transmit location tracking signal 308 on an on-going basis.
  • transmitter 306 may be configured to automatically transmit location tracking signal 308 on a periodic or continuous basis.
  • transmitter 306 may intermittently transmit location tracking signal 308 responsive to certain activities of viewer 106 or other events.
  • Triangulation circuitry 304 is operable to calculate an updated estimate of the location of viewer 106 based on the corresponding versions 310 1 - 310 N of location tracking signal 308 received over time.
  • because reference information generation circuitry 110 a comprises viewer-located circuitry that moves with viewer 106 , triangulation circuitry 304 will be able to produce updated estimates of the location of viewer 106 and provide such updated estimates to processing circuitry 126 .
  • Processing circuitry 126 will then cause modification of at least one of the one or more adaptable display characteristics of adaptable screen assembly 122 so that three-dimensional content will be displayed in a manner that is suitable or optimized for viewing at the current estimated location of viewer 106 .
  • processing circuitry 126 will cause modification of a spatial orientation of audio content 132 to indicate the current estimated location of viewer 106 .
  • each of the receivers 302 1 - 302 N may be included at fixed spatially-dispersed locations within a single housing, and the housing may be placed in a particular location to achieve satisfactory or optimal results.
  • separate housings may be used to contain different ones of receivers 302 1 - 302 N and may be placed at different locations in or around the viewing area to achieve satisfactory or optimal results.
  • one or more of the receivers 302 1 - 302 N may be included in (or attached to) one or more speaker assemblies that are included in audio system 128 .
  • FIG. 4 is a block diagram of a second embodiment of media system 102 in which reference information generation circuitry 110 a and 110 b jointly implement a triangulation technique to determine an estimated location of viewer 106 relative to adaptable screen assembly 122 .
  • reference information generation circuitry 110 b includes a plurality of transmitters 402 1 - 402 N that are operable to transmit a corresponding location tracking signal 412 1 - 412 N .
  • Location tracking signals 412 1 - 412 N may comprise, for example, RF signals or other wireless signals.
  • reference information generation circuitry 110 a includes a plurality of receivers 406 1 - 406 N and triangulation circuitry 408 connected thereto.
  • Receivers 406 1 - 406 N are operable to receive corresponding location tracking signals 412 1 - 412 N .
  • Triangulation circuitry 408 is operable to determine an estimated location of viewer 106 based on characteristics of the received location tracking signals 412 1 - 412 N . For example, triangulation circuitry 408 may determine the estimated location of viewer 106 by determining a distance to each of transmitters 402 1 - 402 N based on the location signals received therefrom, although this is only an example.
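  • A minimal sketch of the distance-based variant (trilateration by linearized least squares); again an illustration under assumed geometry rather than the patent's specified method:

```python
import numpy as np

def trilaterate(transmitters, distances):
    """Position fix (2D) from distances to N >= 3 fixed transmitters.
    Subtracting the first range equation from the others cancels the
    quadratic terms, leaving a linear system A p = b."""
    t = np.asarray(transmitters, dtype=float)
    d = np.asarray(distances, dtype=float)
    A = 2.0 * (t[1:] - t[0])
    b = (d[0] ** 2 - d[1:] ** 2) + np.sum(t[1:] ** 2, axis=1) - np.sum(t[0] ** 2)
    return np.linalg.lstsq(A, b, rcond=None)[0]

tx = [(0.0, 0.0), (5.0, 0.0), (0.0, 5.0)]
true_pos = np.array([2.0, 1.0])
dist = [np.linalg.norm(true_pos - np.asarray(p)) for p in tx]
print(trilaterate(tx, dist))   # approximately [2.0  1.0]
```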
  • the estimated location of viewer 106 is then provided by triangulation circuitry 408 to reference information generation circuitry 110 b via a wired or wireless communication channel established between a transmitter 410 of reference information generation circuitry 110 a and a receiver 404 of reference information generation circuitry 110 b.
  • Reference information generation circuitry 110 b then provides the estimated location of viewer 106 to processing circuitry 126 as part of the above-described reference information.
  • Transmitters 402 1 - 402 N are operable to transmit location tracking signals 412 1 - 412 N on an on-going basis.
  • transmitters 402 1 - 402 N may be configured to automatically transmit location tracking signals 412 1 - 412 N on a periodic or continuous basis.
  • transmitters 402 1 - 402 N may intermittently transmit location tracking signals 412 1 - 412 N responsive to certain activities of viewer 106 or other events.
  • Triangulation circuitry 408 is operable to calculate an updated estimate of the location of viewer 106 based on the versions of location tracking signals 412 1 - 412 N received over time.
  • because reference information generation circuitry 110 a comprises viewer-located circuitry that moves with viewer 106 , triangulation circuitry 408 will be able to produce updated estimates of the location of viewer 106 and provide such updated estimates to reference information generation circuitry 110 b for forwarding to processing circuitry 126 .
  • Processing circuitry 126 will then cause modification of at least one of the one or more adaptable display characteristics of adaptable screen assembly 122 so that three-dimensional content will be displayed in a manner that is suitable or optimized for viewing at the current estimated location of viewer 106 .
  • processing circuitry 126 will cause modification of a spatial orientation of audio content 132 to indicate the current estimated location of viewer 106 .
  • each of the transmitters 402 1 - 402 N may be included at fixed locations within a single housing, and the housing may be placed in a particular location to achieve satisfactory or optimal results.
  • separate housings may be used to contain different ones of transmitters 402 1 - 402 N and may be placed at different locations in or around the viewing area to achieve satisfactory or optimal results.
  • one or more of the transmitters 402 1 - 402 N may be included in (or attached to) one or more speaker assemblies that are included in audio system 128 .
  • FIG. 5 is a block diagram of a further embodiment of media system 102 in which reference information generation circuitry 110 a and 110 b jointly implement an infrared (IR) distance measurement system to help determine an estimated location of viewer 106 relative to adaptable screen assembly 122 .
  • reference information generation circuitry 110 b includes one or more IR light sources 502 and reference information generation circuitry 110 a includes one or more IR sensors 506 .
  • IR sensor(s) 506 are configured to sense IR light 508 emitted by IR light source(s) 502 and to analyze characteristics associated with such light to help generate information concerning an estimated location of viewer 106 with respect to adaptable screen assembly 122 .
  • the estimated location of viewer 106 may then be provided by reference information generation circuitry 110 a to reference information generation circuitry 110 b via a wired or wireless communication channel established between a transmitter 510 of reference information generation circuitry 110 a and a receiver 504 of reference information generation circuitry 110 b.
  • Reference information generation circuitry 110 b then provides the estimated location of viewer 106 to processing circuitry 126 as part of the above-described reference information.
  • Processing circuitry 126 will then cause modification of at least one of the one or more adaptable display characteristics of adaptable screen assembly 122 so that three-dimensional content will be displayed in a manner that is suitable or optimized for viewing at the current estimated location of viewer 106 .
  • processing circuitry 126 will cause modification of a spatial orientation of audio content 132 to indicate the current estimated location of viewer 106 .
  • the IR distance measurement system may be implemented by incorporating one or more IR light sources into reference information generation circuitry 110 a and incorporating one or more IR sensors into reference information generation circuitry 110 b.
  • reference information generation circuitry 110 b includes one or more IR light sources for projecting IR light toward the viewing area and one or more IR sensors for sensing IR light reflected from objects in the viewing area. Characteristics of the IR light reflected from the objects in the viewing area may then be analyzed to help estimate a current location of viewer 106 .
  • a like system could also be implemented by reference information generation circuitry 110 a, except that the IR light would be projected out from the viewer's location instead of toward the viewing area. Still other IR distance measurement systems may be used to generate the aforementioned reference information.
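  • For reference, the core arithmetic of a time-of-flight IR distance measurement is simply half the round trip at the speed of light; a toy sketch (the function name is illustrative):

```python
def ir_distance(round_trip_seconds: float, c: float = 3.0e8) -> float:
    """One-way distance from a round-trip IR time-of-flight measurement."""
    return c * round_trip_seconds / 2.0

print(ir_distance(20e-9))   # a 20 ns round trip corresponds to 3.0 m
```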
  • FIG. 6 is a block diagram of a further embodiment of media system 102 in which reference information generation circuitry 110 a and 110 b jointly implement a magnetic field detection system to help determine an estimated location of viewer 106 relative to adaptable screen assembly 122 .
  • reference information generation circuitry 110 a includes one or more magnetic field sources 604 and reference information generation circuitry 110 b includes one or more magnetic field sensors 602 .
  • Magnetic field sensor(s) 602 are configured to sense the magnetic field(s) generated by magnetic field source(s) 604 and to analyze characteristics associated therewith to help generate information concerning an estimated location of viewer 106 with respect to adaptable screen assembly 122 .
  • the estimated location of viewer 106 may then be provided by reference information generation circuitry 110 b to processing circuitry 126 as part of the above-described reference information. Processing circuitry 126 will then cause modification of at least one of the one or more adaptable display characteristics of adaptable screen assembly 122 so that three-dimensional content will be displayed in a manner that is suitable or optimized for viewing at the current estimated location of viewer 106 . In addition or alternatively, processing circuitry 126 will cause modification of a spatial orientation of audio content 132 to indicate the current estimated location of viewer 106 .
  • the magnetic field detection system may be implemented by incorporating one or more magnetic field sources into reference information generation circuitry 110 b and incorporating one or more magnetic field sensors into reference information generation circuitry 110 a.
  • reference information generation circuitry 110 b includes one or more magnetic field sources for projecting magnetic fields toward the viewing area and one or more magnetic field sensors for sensing magnetic fields reflected from objects in the viewing area. Characteristics of the magnetic fields reflected from the objects in the viewing area may then be analyzed to help estimate a current location of viewer 106 .
  • a like system could also be implemented by reference information generation circuitry 110 a, except that the magnetic fields would be projected out from the viewer's location instead of toward the viewing area. Still other magnetic field detection systems may be used to generate the aforementioned reference information.
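  • As a hedged illustration of magnetic-field-based ranging: for a dipole-like source the field magnitude falls off roughly as 1/r³, so a calibration measurement at a known distance permits a range estimate (real systems must also handle sensor orientation and interference, which this sketch ignores):

```python
def magnetic_range(b_measured, b_ref, r_ref):
    """Range from a dipole-like magnetic source whose field magnitude
    falls off as 1/r^3, calibrated by b_ref measured at known r_ref."""
    k = b_ref * r_ref ** 3          # dipole constant from the calibration
    return (k / b_measured) ** (1.0 / 3.0)

# Field one eighth of the 1 m calibration value implies twice the range:
print(magnetic_range(b_measured=1.25e-7, b_ref=1.0e-6, r_ref=1.0))   # 2.0
```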
  • FIG. 7 is a block diagram of a further embodiment of media system 102 in which reference information generation circuitry 110 a includes one or more cameras and one or more microphones for facilitating the generation of the aforementioned reference information.
  • reference information generation circuitry 110 a includes one or more cameras 708 , one or more microphones 710 , and a transmitter 712 .
  • Camera(s) 708 operate to capture images of the viewing environment of viewer 106 and are preferably carried or mounted on viewer 106 in such a manner as to capture images that correspond to a field of vision of viewer 106 . These images are then transmitted by transmitter 712 to a receiver 702 in reference information generation circuitry 110 b via a wired or wireless communication channel. Such images are then processed by image processing circuitry 704 within reference information generation circuitry 110 b. Image processing circuitry 704 may process such images to determine a current estimated location and/or head orientation of viewer 106 .
  • image processing circuitry 704 may compare such images to one or more reference images in order to determine a current estimated location and/or head orientation of viewer 106 .
  • the reference images may comprise, for example, images of adaptable screen assembly 122 , speakers that are included in audio system 128 , and/or other objects or points of interest normally viewable by a viewer of media system 102 captured from one or more locations and at one or more orientations within the viewing area.
  • image processing circuitry 704 may calculate measurements associated with representations of objects or points of interest captured in such images and then compare those measurements to known measurements associated with the objects or points of interest to determine a current estimated location and/or head orientation of viewer 106 . Still other techniques may be used to process such images to determine an estimated current location and/or head orientation of viewer 106 .
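  • A minimal sketch of the known-measurement comparison using the pinhole camera model (the focal length in pixels and the object's physical width are assumed calibration data; the function name is illustrative):

```python
def distance_from_apparent_size(known_width_m, pixel_width, focal_px):
    """Pinhole model: an object of known physical width spanning
    pixel_width pixels lies at d = f_px * W / w_px from the camera."""
    return focal_px * known_width_m / pixel_width

# A 1.0 m wide screen bezel spanning 500 px, with an 800 px focal length:
print(distance_from_apparent_size(1.0, 500, 800))   # 1.6 m
```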
  • Image processing circuitry 704 then provides the estimated location and/or head orientation of viewer 106 to processing circuitry 126 as part of the above-described reference information. Processing circuitry 126 will then cause modification of at least one of the one or more adaptable display characteristics of adaptable screen assembly 122 so that three-dimensional content will be displayed in a manner that is suitable or optimized for viewing at the current estimated location and/or in accordance with the current estimated head orientation of viewer 106 . In addition or alternatively, processing circuitry 126 will cause modification of a spatial orientation of audio content 132 to indicate the current estimated location and/or current estimated head orientation of viewer 106 .
  • images captured by camera(s) 708 and/or processed by image processing circuitry 704 need not comprise images of the type intended for viewing by human eyes. Rather, such images may comprise images of a resolution or frequency range that is beyond the capability of the rods and cones of the human eye.
  • images of adaptable screen assembly 122 captured by camera(s) 708 are processed by image processing circuitry 704 to determine or measure one or more qualities relating to how adaptable screen assembly 122 is currently presenting two-dimensional or three-dimensional content to viewer 106 .
  • Such qualities may include but are not limited to image sharpness, brightness, contrast, resolution, and colors.
  • Image processing circuitry 704 provides information concerning the determined or measured qualities to processing circuitry 126 . If processing circuitry 126 determines that a particular quality of the presentation is not acceptable, processing circuitry 126 can implement changes to one or more of the adaptable display characteristics of adaptable screen assembly 122 to adjust that particular quality until it is deemed acceptable. In this manner, media system 102 can implement an image-based feedback mechanism for improving the quality of presentation of two-dimensional and three-dimensional content to a viewer.
  • Microphone(s) 710 included within reference information generation circuitry 110 a operate to capture one or more audio signal(s) which are transmitted by transmitter 712 to receiver 702 in reference information generation circuitry 110 b. Such audio signal(s) are then processed by audio processing circuitry 706 within reference information generation circuitry 110 b. Audio processing circuitry 706 may process such audio signal(s) to determine a current estimated location and/or head orientation of viewer 106 . For example, audio processing circuitry 706 may process such audio signal(s) to determine a direction of arrival associated with one or more speakers of audio system 128 that are located in or around the viewing environment. Such directions of arrival may then be utilized to estimate a current location and/or head orientation of viewer 106 .
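  • One common way to estimate such a direction of arrival (a sketch under simplifying free-field assumptions, not the patent's specified algorithm) is to cross-correlate two microphone channels and convert the peak lag to an angle:

```python
import numpy as np

def doa_from_stereo(sig_l, sig_r, mic_spacing, fs, c=343.0):
    """Direction of arrival from the inter-microphone delay found by
    cross-correlation; positive angles mean the source is nearer the
    right microphone under this sign convention."""
    corr = np.correlate(sig_l, sig_r, mode="full")
    lag = np.argmax(corr) - (len(sig_r) - 1)   # samples the left channel lags
    sin_theta = np.clip((lag / fs) * c / mic_spacing, -1.0, 1.0)
    return np.degrees(np.arcsin(sin_theta))

fs = 48000
s = np.sin(2 * np.pi * 440 * np.arange(1024) / fs)
# Simulate the left microphone hearing the tone 5 samples late:
print(doa_from_stereo(np.roll(s, 5), s, mic_spacing=0.17, fs=fs))  # ~12 deg
```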
  • Audio processing circuitry 706 then provides the estimated location and/or head orientation of viewer 106 to processing circuitry 126 as part of the above-described reference information. Processing circuitry 126 will then cause modification of at least one of the one or more adaptable display characteristics of adaptable screen assembly 122 so that three-dimensional content will be displayed in a manner that is suitable or optimized for viewing at the current estimated location and/or in accordance with the current estimated head orientation of viewer 106 . In addition or alternatively, processing circuitry 126 will cause modification of a spatial orientation of audio content 132 to indicate the current estimated location and/or current estimated head orientation of viewer 106 .
  • audio signal(s) captured by microphone(s) 710 are processed by audio processing circuitry 706 to determine or measure one or more qualities relating to how audio system 128 is currently presenting audio content 132 to viewer 106 .
  • Such qualities may include but are not limited to loudness, balance, surround-sound, delay, and audio spatial orientation performance.
  • Audio processing circuitry 706 provides information concerning the determined or measured qualities to processing circuitry 126 . If processing circuitry 126 determines that a particular quality of the presentation is not acceptable, processing circuitry 126 can implement changes to one or more settings or characteristics of audio system 128 to adjust that particular quality until it is deemed acceptable. In this manner, media system 102 can implement an audio-based feedback mechanism for improving the quality of presentation of audio content 132 to a viewer.
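  • A toy sketch of one iteration of such an audio feedback loop (the target level, tolerance, and step size are illustrative parameters, not values from the patent):

```python
def adjust_gain(current_gain, measured_db, target_db, step_db=1.0, tol_db=2.0):
    """One pass of an audio feedback loop: nudge the output gain until the
    loudness measured at the viewer is within tolerance of the target."""
    error = target_db - measured_db
    if abs(error) <= tol_db:
        return current_gain                       # quality deemed acceptable
    step = max(-step_db, min(step_db, error))     # bounded correction per pass
    return current_gain * 10.0 ** (step / 20.0)   # apply the correction in dB

print(adjust_gain(current_gain=1.0, measured_db=60.0, target_db=66.0))  # ~1.12
```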
  • microphone(s) 710 may be used to allow viewer 106 to deliver voice commands for controlling certain aspects of media system 102 , including the manner in which two-dimensional and three-dimensional content is presented via adaptable screen assembly 122 .
  • audio processing circuitry 706 may comprise circuitry for recognizing and extracting such voice commands from the audio signal(s) captured by microphone(s) 710 and passing the commands to processing circuitry 126 .
  • processing circuitry 126 may cause a spatial orientation of audio content 132 and/or at least one of the one or more adaptable display characteristics of adaptable screen assembly 122 relating to the presentation of two-dimensional or three-dimensional content to be modified.
  • voice commands may be used for other purposes as well, including controlling what audio content is provided to viewer 106 via audio system 128 and/or what visual content is delivered to viewer 106 via adaptable screen assembly 122 .
  • image processing circuitry 704 and audio processing circuitry 706 are shown as part of reference information generation circuitry 110 b. It is noted that in alternate embodiments, such circuitry may instead be included within reference information generation circuitry 110 a. In accordance with still further embodiments, image processing circuitry and/or audio processing circuitry may be distributed among reference information generation circuitry 110 a and 110 b.
  • FIG. 8 is a block diagram of a further embodiment of media system 102 in which reference information generation circuitry 110 a includes a head orientation sensor 808 and eye tracking circuitry 806 for determining a head orientation and point of gaze, respectively, of viewer 106 .
  • Head orientation sensor(s) 808 may include, for example and without limitation, an accelerometer or other device designed to detect motion in three-dimensions or tilting in a two-dimensional reference plane.
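  • For illustration, static head tilt can be recovered from a three-axis accelerometer's gravity reading; the axis convention below is an assumption:

```python
import numpy as np

def tilt_from_accel(ax, ay, az):
    """Static pitch and roll (degrees) from a head-worn accelerometer's
    gravity reading, with x forward, y left, z up; valid only while the
    head is not otherwise accelerating."""
    pitch = np.degrees(np.arctan2(ax, np.hypot(ay, az)))
    roll = np.degrees(np.arctan2(ay, az))
    return pitch, roll

print(tilt_from_accel(0.0, 0.0, 9.81))   # a level head reads (0.0, 0.0)
```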
  • Eye tracking circuitry 806 may comprise any system or device suitable for tracking the motion of the eyes of a viewer to determine a point of gaze therefrom.
  • the determined head orientation and point of gaze of viewer 106 is transmitted by a transmitter 804 included in reference information generation circuitry 110 a to a receiver 802 included in reference information generation circuitry 110 b via a wired or wireless communication channel.
  • the determined head orientation and point of gaze of viewer 106 may then be provided by reference information generation circuitry 110 b to processing circuitry 126 as part of the above-described reference information.
  • Processing circuitry 126 will then cause modification of at least one of the one or more adaptable display characteristics of adaptable screen assembly 122 so that three-dimensional content will be displayed in a manner that is suitable or optimized for viewing by viewer 106 in light of the determined head orientation and/or point of gaze of viewer 106 .
  • processing circuitry 126 will cause modification of a spatial orientation of audio content 132 to indicate the current estimated head orientation and/or current estimated point of gaze of viewer 106 .
  • FIG. 9 is a block diagram of a further embodiment of media system 102 in which reference information is generated entirely by non-viewer-located reference information generation circuitry 110 .
  • reference information generation circuitry 110 includes one or more camera(s) 902 and one or more microphone(s) 904 .
  • Camera(s) 902 operate to capture images of a viewing area in front of adaptable screen assembly 122 .
  • the images may be captured using ambient light or, alternatively, reference information generation circuitry 110 may include one or more light sources (e.g., IR light sources or other types of light sources) that operate to radiate light into the viewing area so that camera(s) 902 may capture light reflected from people and objects in the viewing area.
  • the images captured by camera(s) 902 are processed by image/audio processing circuitry 906 to determine an estimated location of viewer 106 .
  • microphone(s) 904 operate to capture audio signals that include audio content 132 from speakers that are included in audio system 128 . For instance, such speakers may be located in and around the viewing area in front of adaptable screen assembly 122 .
  • the audio signals captured by microphone(s) 904 are also processed by image/audio processing circuitry 906 to determine an estimated location of viewer 106 .
  • Image/audio processing circuitry 906 then provides the estimated location of viewer 106 to processing circuitry 126 as part of the above-described reference information.
  • Processing circuitry 126 will then cause modification of at least one of the one or more adaptable display characteristics of adaptable screen assembly 122 so that three-dimensional content will be displayed in a manner that is suitable or optimized for viewing at the current estimated location of viewer 106 .
  • processing circuitry 126 will cause modification of a spatial orientation of audio content 132 to indicate the current estimated location of viewer 106 .
  • the audio signal(s) captured by microphone(s) 904 may also be processed to detect and extract voice commands uttered by viewer 106 , such voice commands being executed by processing circuitry 126 to facilitate viewer control over a spatial orientation of audio content 132 and/or at least one of the one or more adaptable display characteristics of adaptable screen assembly 122 relating to the presentation of two-dimensional or three-dimensional content.
  • voice commands may be used for other purposes as well, including controlling what audio content is provided to viewer 106 via audio system 128 and/or what visual content is delivered to viewer 106 via adaptable screen assembly 122 .
  • reference information generation circuitry may be used to produce reference information corresponding to at least one positional characteristic of a viewing reference of a viewer.
  • reference information generation circuitry may include ultra wide band (UWB) circuitry (e.g., UWB receivers and/or UWB transmitters).
  • certain features of the reference information generation circuitry described in regard to FIGS. 3-9 may be combined to produce additional embodiments. For example, an embodiment may utilize a combination of triangulation, IR distancing, head orientation sensing and eye tracking to generate extremely precise reference information concerning a viewing reference of a viewer.
  • FIG. 10 depicts an exemplary headset 1000 that may implement various features of reference information generation circuitry 110 a as described above.
  • headset 1000 includes a frame 1002 that supports a right lens 1004 A and a left lens 1004 B.
  • Right lens 1004 A and left lens 1004 B may comprise, for example, colored, polarizing, or shuttering lenses that enable a viewer to perceive certain types of three-dimensional content delivered by certain types of screen assemblies.
  • lenses 1004 A and 1004 B can be used to control how video content that is rendered to a screen assembly is perceived by a viewer.
  • frame 1002 is mounted in such a manner that it can be flipped up and away from the eyes of a viewer wearing headset 1000 when the viewer does not desire or need to use such lenses.
  • headset 1000 also includes a right speaker assembly 1006 A and a left speaker assembly 1006 B, each of which may include one or more speakers.
  • Such speakers can be used to deliver audio content to a viewer of a screen assembly, wherein the audio content is related to video content displayed on the screen assembly and viewable by the viewer.
  • Such speakers can deliver other types of audio content, as well.
  • right speaker assembly 1006 A and left speaker assembly 1006 B include right ear speaker assembly 204 a and left ear speaker assembly 204 b, respectively, as described above with reference to FIG. 2 .
  • Headset 1000 further includes a microphone 1002 .
  • Microphone 1002 may be used to support any of the functionality described above in reference to microphone(s) 710 of FIG. 7 .
  • headset 1000 may include additional microphones beyond microphone 1002 .
  • Headset 1000 still further includes a battery compartment 1010 for housing one or more batteries. Such battery or batteries may or may not be rechargeable depending upon the implementation.
  • headset 1000 includes an interface for connecting to an external power source. The connection to the external power source may be made to deliver power to headset 1000 as well as to recharge the battery or batteries stored in battery compartment 1010 .
  • Headset 1000 further includes a forward housing 1002 , a right-side housing 1008 and a left-side housing 1012 . These housings may be used to store any number of the various types of viewer-located reference information generation circuitry described above in reference to FIGS. 3-8 .
  • forward housing 1002 houses one or more cameras that support any of the functionality described above in reference to camera(s) 708 of FIG. 7
  • each side housing stores a transmitter or receiver used for implementing a triangulation system for determining a location of a viewer wearing headset 1000 .
  • Such compartments may also be used to house circuitry used for implementing features relating to IR distancing, magnetic field detection, head orientation sensing, eye tracking, or the like, as described above.
  • Headset 1000 of FIG. 10 is merely one example of how reference information generation circuitry 110 a may be worn by or carried by a user. Any of a wide variety of portable or wearable articles or devices may be used. Furthermore, reference information generation circuitry 110 a may be distributed among multiple articles or devices worn or carried by a viewer.
  • FIG. 11A depicts an implementation in which reference information generation circuitry 110 a is distributed among a headset 1102 and a remote control 1104 , which are connected to each other by a wired communication link in the form of a cable 1106 .
  • remote control 1104 may be connected to other portions of a media system (e.g., reference information generation circuitry 110 b ) via a wireless communication link and act as a conduit for communication between headset 1102 and such other portions.
  • headset 1102 may include speakers for delivering audio content to a viewer, as well as one or more cameras, a head orientation sensor, and an eye movement tracker which support functionality for determining a location, head orientation and point of gaze of a viewer, as described above.
  • Information generated by these components can be delivered via remote control 1104 to other components of the media system for processing and/or can be processed by components within remote control 1104 .
  • Remote control 1104 may also include circuitry for supporting the production of reference information, such as for example, receivers or transmitters used to support triangulation-based viewer location, microphones, or the like.
  • FIG. 11B depicts a different implementation in which reference information generation circuitry 110 a is distributed among a headset 1112 and a laptop computer 1114 , which are connected to each other by a wireless communication link 1116 , such as for example a Bluetooth® connection.
  • laptop computer 1114 may be connected to other portions of a media system (e.g., reference information generation circuitry 110 b ) via a wired or wireless communication link and act as a conduit for communication between headset 1112 and such other portions.
  • headset 1112 may include speakers for delivering audio content to a viewer as well as a microphone for capturing voice commands from a viewer and/or other audio content.
  • Headset 1112 may also include one or more cameras, a head orientation sensor, and an eye movement tracker that support functionality for determining a location, head orientation and point of gaze of a viewer as described above. Information generated by these components can be delivered via laptop computer 1114 to other components of the media system for processing and/or can be processed by components within laptop computer 1114 .
  • Laptop computer 1114 may also include circuitry for supporting the production of reference information, such as for example, receivers or transmitters used to support triangulation-based viewer location, additional cameras and microphones, or the like.
  • FIGS. 11A and 11B provide merely a few examples of how reference information generation circuitry 110 a may be distributed among multiple articles or devices worn or carried by a viewer. These examples are not intended to be limiting. Persons skilled in the relevant art(s) will appreciate that a large number of other variations may be used.
  • FIG. 12 depicts a flowchart 1200 of a method for presenting three-dimensional content to a viewer in accordance with an embodiment.
  • the method of flowchart 1200 will be described herein with continued reference to media system 102 of FIG. 1 . However, the method is not limited to that system.
  • the method of flowchart 1200 begins at step 1202 , in which first circuitry at least assists in producing reference information corresponding to at least one positional characteristic of an orientation of a viewer.
  • the first circuitry may comprise, for example, reference information generation circuitry 110 a and/or reference information generation circuitry 110 b as described above.
  • the orientation of the viewer may comprise any of a number of positional characteristics that affect how three-dimensional visual content displayed via an adaptable screen assembly and/or audio content that is associated therewith will be perceived by the viewer.
  • positional characteristics may include, for example and without limitation, a position or location of the viewer relative to the adaptable screen assembly, a head orientation of the viewer and a point of gaze of the viewer.
  • the reference information may be produced using any of the approaches previously described herein as well as additional approaches not described herein.
  • at step 1204 , the reference information produced during step 1202 is provided to second circuitry.
  • the second circuitry issues one or more control signals to cause modification of at least one of one or more adaptable display characteristics of an adaptable screen assembly.
  • the modification corresponds at least in part to the reference information.
  • the second circuitry may comprise, for example, processing circuitry 126 as described above. Such processing circuitry 126 may cause the modification of the at least one of the one or more adaptable display characteristics by sending one or more suitable control signals to driver circuitry 124 .
  • the one or more adaptable display characteristics may include, but are not limited to, a configuration of an adaptable light manipulator that forms part of the adaptable screen assembly, a manner in which images are mapped to display pixels in a pixel array that forms part of the adaptable screen assembly, and/or a distance and angular alignment between such an adaptable light manipulator and such a pixel array.
  • the second circuitry issues one or more control signals to cause modification of a spatial orientation of audio content.
  • the modification corresponds at least in part to the reference information.
  • the second circuitry may comprise, for example, processing circuitry 126 , which may cause the modification of the spatial orientation of the audio content by sending one or more suitable control signals to driver circuitry 124 .
  • the modification of the spatial orientation may include modification of an amplitude of sound that corresponds to a specified sound source that corresponds to (e.g., is depicted in) a three-dimensional presentation that is supported by the adaptable screen assembly.
  • the modification of the spatial orientation may include modification of a delay that is associated with such sound.
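  • A minimal sketch combining both modifications: a distance-derived gain and propagation delay for one depicted sound source relative to one ear position (the 1/d falloff and the speed of sound are standard simplifications, not patent-specified values):

```python
import numpy as np

def source_amplitude_and_delay(source_pos, ear_pos, ref_distance=1.0, c=343.0):
    """Distance-derived gain (1/d falloff, clamped in the near field) and
    propagation delay for one depicted sound source and one ear position."""
    d = np.linalg.norm(np.asarray(source_pos, float) - np.asarray(ear_pos, float))
    gain = ref_distance / max(d, ref_distance)
    delay = d / c                                  # seconds of travel time
    return gain, delay

# A source 2 m right and 3 m ahead of the ear:
print(source_amplitude_and_delay((2.0, 3.0, 0.0), (0.0, 0.0, 0.0)))
```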
  • FIG. 13 depicts a flowchart 1300 of a method for delivering video output and audio output to a viewer based at least in part on positional characteristic(s) relating to an orientation of the viewer in accordance with an embodiment.
  • the method of flowchart 1300 will be described herein with continued reference to media system 102 of FIG. 1 . However, the method is not limited to that system.
  • the method of flowchart 1300 begins at step 1302 , in which at least one positional characteristic relating to a first orientation of a viewer within a premises is identified.
  • positional characteristics may include, for example and without limitation, a position or location of the viewer relative to one or more objects (e.g., an adaptable screen assembly, one or more speakers, etc.) in the premises, a head orientation of the viewer, and/or a point of gaze of the viewer.
  • the position or location of the viewer relative to an object (i.e., the relative position or location of the viewer) may include a distance from the object or some reference point associated therewith, and such distance may include both horizontal distance and elevation.
  • the position or location of the viewer may also include eye locations of the viewer.
  • the head orientation of the viewer may include a degree of tilt and/or rotation of the head of the viewer.
  • reference information generation circuitry 110 a and/or reference information generation circuitry 110 b identifies the at least one positional characteristic relating to a first orientation of viewer 106 .
  • a video output for a three-dimensional visual presentation, which is tailored based at least in part on the at least one positional characteristic, is delivered to the viewer in the first orientation.
  • adaptable screen assembly 122 delivers the video output for a three-dimensional visual presentation to viewer 106 in the first orientation.
  • audio output is tailored based at least in part on the at least one positional characteristic.
  • tailoring the audio output may include selecting the audio output from a plurality of audio outputs.
  • each of the plurality of audio outputs may correspond to a respective three-dimensional view, a respective designated (e.g., predetermined) location and/or respective designated (e.g., predetermined) head orientation of the viewer, etc.
  • the audio output may be selected based on a signal that is generated in response to input from the viewer.
  • tailoring the audio output may include generating the audio output.
  • tailoring the audio output may include changing an amplitude of the audio output.
  • tailoring the audio output may include adding a delay to the audio output or removing a delay from the audio output.
  • driver circuitry 124 tailors audio output corresponding to audio content 132 based at least in part on the at least one positional characteristic in accordance with control signals that are received from processing circuitry 126 .
  • tailoring the audio output may be based at least in part on image data that is captured within the premises.
  • the image data may be captured by cameras 708 as described above with reference to FIG. 7 and/or by cameras 1002 as described above with reference to FIG. 10 .
  • the image data may be captured from a perspective of the viewer and/or from a perspective that is directed toward the viewer.
  • tailoring the audio output may be based at least in part on captured audio data that is captured within the premises.
  • the audio data may be captured by microphones 710 as described above with reference to FIG. 7 and/or by microphones 1004 as described above with reference to FIG. 10 .
  • the audio data may be captured from a perspective of the viewer and/or from a perspective that is directed toward the viewer.
  • the audio output, which is tailored based at least in part on the at least one positional characteristic, is delivered to audibly supplement the video output for the viewer in the first orientation.
  • audio system 128 delivers the audio output to audibly supplement the video output for viewer 106 in the first orientation.
  • one or more steps 1302 , 1304 , 1306 , and/or 1308 of flowchart 1300 may not be performed. Moreover, steps in addition to or in lieu of steps 1302 , 1304 , 1306 , and/or 1308 may be performed.
  • FIG. 14 depicts a flowchart 1400 of a method for delivering an audio experience for ears of a listener via a plurality of speakers in accordance with an embodiment.
  • the method of flowchart 1400 will be described herein with continued reference to media system 102 of FIG. 1 . However, the method is not limited to that system.
  • the method of flowchart 1400 begins at step 1402 , in which ears of a listener are detected to be in a first orientation with respect to a plurality of speakers.
  • reference information generation circuitry 110 a and/or reference information generation circuitry 110 b detects that the ears of viewer 106 are in a first orientation with respect to speakers 130 a - 130 g.
  • first audio output that is based on audio content is delivered to attempt to establish, with a spatial orientation of the audio content, an audio experience for the ears of the listener in the first orientation.
  • the spatial orientation of the audio content may be configured to accord with the ears of the listener being in the first orientation to provide the first audio output.
  • audio system 128 delivers the first audio output that is based on the audio content to attempt to establish the audio experience for the ears of viewer 106 in the first orientation.
  • the ears of the listener are detected to be in a second orientation with respect to the plurality of speakers.
  • reference information generation circuitry 110 a and/or reference information generation circuitry 110 b detects that the ears of viewer 106 are in a second orientation with respect to speakers 130 a - 130 g.
  • second audio output is delivered to attempt to establish, with the spatial orientation of the audio content, the audio experience for the ears of the listener in the second orientation.
  • the spatial orientation of the audio content may be modified to accord with the ears of the listener being in the second orientation to provide the second audio output.
  • audio system 128 delivers the second audio output to attempt to establish the audio experience for the ears of viewer 106 in the second orientation.
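  • As an illustration of keeping the audio experience stable across head orientations (a sketch; the sign conventions are assumptions), a virtual source's rendering azimuth can be counter-rotated by the detected head yaw:

```python
def compensate_head_yaw(source_azimuth_deg, head_yaw_deg):
    """Keep a virtual source fixed in the room as the listener turns:
    re-pan it by subtracting the head yaw (positive yaw = turn left),
    wrapped into [-180, 180) degrees relative to the listener's nose."""
    return (source_azimuth_deg - head_yaw_deg + 180.0) % 360.0 - 180.0

# A source straight ahead, after the listener turns 30 degrees left,
# must now be rendered 30 degrees off the listener's right side:
print(compensate_head_yaw(0.0, 30.0))   # -30.0
```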
  • a three-dimensional visual presentation that is tailored based on the listener being at a first location is delivered.
  • element(s) of an adaptable screen display may be controlled to deliver the three-dimensional visual presentation.
  • adaptable screen assembly 122 delivers the three-dimensional visual presentation that is tailored based on viewer 106 being at the first location.
  • a move by the listener to a second location is detected.
  • reference information generation circuitry 110 a and/or reference information generation circuitry 110 b detects a move by viewer 106 to the second location.
  • third audio output is delivered to establish, with the spatial orientation of the audio content, the audio experience for the ears of the listener at the second location.
  • the spatial orientation of the audio content may be modified to accord with the ears of the listener being at the second location to provide the third audio output.
  • audio system 128 delivers the third audio output to establish the audio experience for the ears of viewer 106 at the second location.
  • a second three-dimensional visual presentation that is tailored based on the listener being at the second location is delivered.
  • element(s) of an adaptable screen display may be controlled to deliver the second three-dimensional visual presentation.
  • adaptable screen assembly 122 delivers the second three-dimensional visual presentation that is tailored based on viewer 106 being at the second location.
  • one or more steps 1402 , 1404 , 1406 , 1408 , 1410 , 1412 , 1414 , and/or 1416 of flowchart 1400 may not be performed. Moreover, steps in addition to or in lieu of steps 1402 , 1404 , 1406 , 1408 , 1410 , 1412 , 1414 , and/or 1416 may be performed.
  • FIG. 15 is a block diagram of a media system 1502 that simultaneously presents first three-dimensional content to a first viewer 1506 having a first viewing reference 1508 and second three-dimensional content to a second viewer 1536 having a second viewing reference 1538 in accordance with an embodiment.
  • media system 1502 includes an adaptable screen assembly 1522 , driver circuitry 1524 , processing circuitry 1526 , audio system 1528 , reference information generation circuitry 1510 a, reference information generation circuitry 1510 b, and reference information generation circuitry 1510 c.
  • media system 1502 operates to deliver light that includes one or more viewable images to a viewing area that includes first viewer 1506 and second viewer 1536 .
  • Media system 1502 also operates to deliver audio content that is associated with the one or more viewable images toward the viewing area.
  • Media system 1502 may include, for example and without limitation, a television, a projection system, a home theater system, a monitor, a computing device (e.g., desktop computer, laptop computer, tablet computer) or a handheld device (e.g., a cellular phone, smart phone, personal media player, personal digital assistant), wherein the computing device or handheld device has at least one attached or integrated display.
  • Display characteristics of adaptable screen assembly 1522 may be modified to simultaneously present a first three-dimensional view of first visual content to first viewer 1506 and a second three-dimensional view of second visual content to second viewer 1536 .
  • Adaptable screen assemblies and manners of operating the same that can achieve this are described in the aforementioned, incorporated U.S. patent application Ser. No. 12/845,461, filed on Jul.
  • Such display characteristics may include, but are not limited to, the configuration of one or more adaptable light manipulators, the manner in which images are mapped to display pixels in a pixel array, the distance between the pixel array and the adaptable light manipulator(s), the angular orientation of the adaptable light manipulator(s), and the like.
  • the adaptation of the display characteristics of adaptable screen assembly 1522 may be carried out, in part, by sending coordinated drive signals to various elements (e.g., a non-uniform backlight generator, a pixel array and an adaptable light manipulator) that comprise adaptable screen assembly 1522 .
  • This function is performed by driver circuitry 1524 responsive to the receipt of control signals from processing circuitry 1526 .
  • the manner in which such coordinated drive signals may be generated is described in the aforementioned, incorporated U.S. patent application Ser. No. ______ (Attorney Docket No. A05.01240000), filed on even date herewith and entitled “Coordinated Driving of Adaptable Light Manipulator, Backlighting and Pixel Array in Support of Adaptable 2D and 3D Displays.”
  • Audio system 1528 is configured to deliver first audio content 1532 that is associated with the first three-dimensional view of the first visual content for at least first viewer 1506 and second audio content 1542 that is associated with the second three-dimensional view of the second visual content for at least second viewer 1536 . Audio system 1528 delivers first audio content 1532 and second audio content 1542 in accordance with respective spatial orientations that are controlled by driver circuitry 1524 .
  • driver circuitry 1524 may modify a first spatial orientation of first audio content 1532 to take into consideration a change in orientation of first viewer 1506 with respect to sound sources that correspond to (e.g., are depicted in) the first three-dimensional view and/or changes in orientation of such sound sources with respect to first viewer 1506 .
  • sound sources may include a representation of second viewer 1536 in the context of the first three-dimensional view, for example.
  • driver circuitry 1524 may modify a second spatial orientation of second audio content 1542 to take into consideration a change in orientation of second viewer 1536 with respect to sound sources that correspond to (e.g., are depicted in) the second three-dimensional view and/or changes in orientation of such sound sources with respect to second viewer 1536 .
  • Such sound sources may include a representation of first viewer 1506 in the context of the second three-dimensional view, for example.
  • Driver circuitry 1524 modifies the first spatial orientation of first audio content 1532 and/or the second spatial orientation of second audio content 1542 responsive to the receipt of control signals from processing circuitry 1526 .
  • audio system 1528 includes first viewer-specific speakers 1544 (e.g., in a first headset or earbuds) worn or otherwise used by first viewer 1506 and second viewer-specific speakers 1546 (e.g., in a second headset or earbuds) worn or otherwise used by second viewer 1536 .
  • audio system 1528 may provide first audio content 1532 to first viewer 1506 via first viewer-specific speakers 1544
  • audio system 1528 may provide second audio content 1542 to second viewer 1536 via second viewer-specific speakers 1546 . Accordingly, first viewer 1506 may hear first audio content 1532 without hearing second audio content 1542 , and second viewer 1536 may hear second audio content 1542 without hearing first audio content 1532 .
  • audio system 1528 includes viewer-agnostic speakers (e.g., speakers mounted around a room in which first and second viewers 1506 and 1536 are located) and second viewer-specific speakers 1546 .
  • audio system 1528 may provide first audio content 1532 to both first viewer 1506 and second viewer 1536 via the viewer-agnostic speakers, and audio system 1528 may provide second audio content 1542 to second viewer 1536 via second viewer-specific speakers 1546 . Accordingly, first viewer 1506 may hear first audio content 1532 without hearing second audio content 1542 , and second viewer 1536 may hear both first audio content 1532 and second audio content 1542 .
  • second viewer 1536 may hear second audio content 1542 but not first audio content 1532 if noise cancellation techniques are employed to hinder second viewer 1536 from perceiving first audio content 1532 .
  • Audio system 1528 may or may not include first viewer-specific speakers 1544 for providing the first audio content 1532 to first viewer 1506 , in addition to the viewer-agnostic speakers providing the first audio content 1532 to both first viewer 1506 and second viewer 1536 .
  • audio system 1528 includes viewer-agnostic speakers but no viewer-specific speakers (e.g., viewer-specific speakers 1544 and 1546 ).
  • reference information generation circuitry 1510 a, 1510 b, and 1510 c may operate in conjunction to determine a reference orientation that is based on the orientation of first viewer 1506 and the orientation of second viewer 1536 .
  • the reference orientation may be an average of the orientation of first viewer 1506 and the orientation of second viewer 1536 .
  • processing circuitry 1526 may control driver circuitry 1524 to modify the first spatial orientation of the first audio content 1532 and the second spatial orientation of the second audio content 1542 in accordance with the reference orientation to provide resulting audio content that is based on the reference orientation. Accordingly, audio system 1528 may provide the resulting audio content to both first viewer 1506 and second viewer 1536 . First viewer 1506 and second viewer 1536 both may hear the resulting audio content. It will be recognized that audio system 1528 may provide first audio content 1532 and second audio content 1542 via any combination of viewer-specific and/or viewer-agnostic speakers.
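  • A small sketch of one way to form such an average orientation (an illustration, not the patent's specified computation): head yaws are angles, so a circular mean avoids the wrap-around artifact of a plain arithmetic mean:

```python
import numpy as np

def reference_orientation(yaws_deg):
    """Circular mean of the viewers' head yaws, so that e.g. 350 and 10
    degrees average to 0 rather than to 180."""
    ang = np.radians(np.asarray(yaws_deg, dtype=float))
    return np.degrees(np.arctan2(np.sin(ang).mean(), np.cos(ang).mean()))

print(reference_orientation([350.0, 10.0]))   # approximately 0.0
```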
  • Reference information generation circuitry 1510 a and 1510 c comprise components of media system 1502 that operate in conjunction to produce first reference information concerning at least one positional characteristic of first viewing reference 1508 (i.e., orientation) of first viewer 1506 with respect to adaptable screen assembly 1522 .
  • Reference information generation circuitry 1510 b and 1510 c comprise components of media system 1502 that operate in conjunction to produce second reference information concerning at least one positional characteristic of second viewing reference 1538 (i.e., orientation) of second viewer 1536 with respect to adaptable screen assembly 1522 .
  • First viewing reference 1508 comprises one or more positional characteristics that affect how first three-dimensional visual content displayed via adaptable screen assembly 1522 and/or first audio content 1532 provided by audio system 1528 will be perceived by first viewer 1506 .
  • Second viewing reference 1538 comprises one or more positional characteristics that affect how second three-dimensional visual content simultaneously displayed via adaptable screen assembly 1522 and/or second audio content 1542 simultaneously provided by audio system 1528 will be perceived by second viewer 1536 .
  • Example positional characteristics of a viewing reference were described above.
  • The first reference information produced by reference information generation circuitry 1510 a and 1510 c is provided to processing circuitry 1526 . Based on at least the first reference information, processing circuitry 1526 issues one or more first control signals to driver circuitry 1524 to modify at least one of the display characteristics of adaptable screen assembly 1522 and/or to modify the spatial orientation of first audio content 1532 . Such modifications may be performed, for example, to deliver the first three-dimensional visual content and/or first audio content 1532 to first viewer 1506 in accordance with one or more positional characteristics of first viewing reference 1508 .
  • The second reference information produced by reference information generation circuitry 1510 b and 1510 c is also provided to processing circuitry 1526 .
  • Based on at least the second reference information, processing circuitry 1526 issues one or more second control signals to driver circuitry 1524 to modify at least one of the display characteristics of adaptable screen assembly 1522 and/or to modify the spatial orientation of second audio content 1542 .
  • Such modifications may be performed, for example, to deliver the second three-dimensional visual content and/or second audio content 1542 to second viewer 1536 in accordance with one or more positional characteristics of second viewing reference 1538 .
  • Reference information generation circuitry 1510 a is intended to represent viewer-located circuitry that is situated on or near first viewer 1506 while reference information generation circuitry 1510 b is intended to represent viewer-located circuitry that is situated on or near second viewer 1536 .
  • Reference information generation circuitry 1510 a and 1510 b may include any of the components of reference information generation circuitry 110 a described above in reference to FIGS. 1 and 3-8 .
  • Reference information generation circuitry 1510 c is intended to represent circuitry that is not viewer-located. Reference information generation circuitry 1510 c is configured to interact with reference information generation circuitry 1510 a to determine one or more positional characteristics of first viewing reference 1508 . Such interaction may involve, for example, implementing any of the techniques described above in reference to FIGS. 1 and 3-8 to estimate a location, head orientation and/or point of gaze of first viewer 1506 . Reference information generation circuitry 1510 c is further configured to interact with reference information generation circuitry 1510 b to determine one or more positional characteristics of second viewing reference 1538 . Such interaction may involve, for example, implementing any of the techniques described above in reference to FIGS. 1 and 3-8 to estimate a location, head orientation and/or point of gaze of second viewer 1536 .
  • Together, reference information generation circuitry 1510 a, 1510 b and 1510 c can produce reference information about both viewing references 1508 and 1538 . Such information can be used by processing circuitry 1526 to optimize the delivery of the first three-dimensional content to first viewer 1506 and the simultaneous delivery of the second three-dimensional content to second viewer 1536 .
  • FIG. 16 depicts a flowchart 1600 of a method for simultaneously presenting first three-dimensional content to a first viewer and second three-dimensional content to a second viewer in accordance with an embodiment.
  • The method of flowchart 1600 will be described herein with continued reference to media system 1502 of FIG. 15 . However, the method is not limited to that system.
  • The method of flowchart 1600 begins at step 1602 , in which first circuitry at least assists in producing first reference information corresponding to at least one positional characteristic of a first viewing reference of a first viewer.
  • The first circuitry may comprise, for example, reference information generation circuitry 1510 a and/or reference information generation circuitry 1510 c as described above.
  • The first viewing reference of the first viewer may comprise any of a number of positional characteristics that affect how first three-dimensional visual content displayed via an adaptable screen assembly and/or associated first audio that is provided by an audio system will be perceived by the first viewer.
  • Such positional characteristics may include, for example and without limitation, a position or location of the first viewer relative to the adaptable screen assembly, a head orientation of the first viewer and/or a point of gaze of the first viewer.
  • The first reference information may be produced using any of the approaches previously described herein as well as additional approaches not described herein.
  • At step 1604 , the first reference information produced during step 1602 is provided to second circuitry.
  • The second circuitry issues one or more first control signals to cause modification of at least one of one or more adaptable display characteristics of an adaptable screen assembly based on at least the first reference information.
  • The second circuitry may comprise, for example, processing circuitry 1526 as described above.
  • The one or more adaptable display characteristics may include, but are not limited to, a configuration of one or more adaptable light manipulators that form part of the adaptable screen assembly, a manner in which images are mapped to display pixels in a pixel array that forms part of the adaptable screen assembly, a distance between such pixel array and such adaptable light manipulator(s), an angular orientation of such adaptable light manipulator(s), and the like (one possible representation of such a configuration is sketched below).
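For illustration only, the adaptable display characteristics enumerated above can be pictured as a single configuration record, as in the hypothetical sketch below; the field names and units are invented here, not taken from the patent.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class AdaptableDisplayConfig:
    # Blocking (False) / non-blocking (True) state per adaptable barrier element
    barrier_states: Tuple[bool, ...]
    # How images are mapped to display pixels, e.g. "full_2d" or "interleaved_2view"
    pixel_mapping: str
    # Distance between the pixel array and the adaptable light manipulator(s)
    array_to_manipulator_mm: float
    # Angular orientation of the adaptable light manipulator(s)
    manipulator_angle_deg: float
```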
  • The second circuitry also issues one or more second control signals to cause modification of a spatial orientation of first audio content for at least the first viewer based on at least the first reference information.
  • At step 1608 , third circuitry at least assists in producing second reference information corresponding to at least one positional characteristic of a second viewing reference of a second viewer.
  • The third circuitry may comprise, for example, reference information generation circuitry 1510 b and/or reference information generation circuitry 1510 c as described above.
  • The second viewing reference of the second viewer may comprise any of a number of positional characteristics that affect how second three-dimensional content that is simultaneously displayed with the first three-dimensional content by the adaptable screen assembly and/or associated second audio that is provided by the audio system will be perceived by the second viewer.
  • Such positional characteristics may include, for example and without limitation, a position or location of the second viewer relative to the adaptable screen assembly, a head orientation of the second viewer and/or a point of gaze of the second viewer.
  • The second reference information may be produced using any of the approaches previously described herein as well as additional approaches not described herein.
  • At step 1612 , the second reference information produced during step 1608 is provided to the second circuitry.
  • The second circuitry issues one or more third control signals to cause modification of at least one of the one or more adaptable display characteristics of the adaptable screen assembly based on at least the second reference information.
  • The second circuitry also issues one or more fourth control signals to cause modification of a spatial orientation of second audio content for at least the second viewer based on at least the second reference information; the full flow of flowchart 1600 is sketched below.
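The flow just described can be summarized in a short runnable sketch. Every class and method name below is a stand-in invented for illustration; the patent does not define this API.

```python
class Tracker:
    """Stand-in for the first/third circuitry that produces reference info."""
    def __init__(self, reference):
        self._reference = reference
    def reference_info(self):
        return self._reference                     # steps 1602 / 1608

class Display:
    def adapt(self, viewer_id, reference):         # first / third control signals
        print(f"display adapted for viewer {viewer_id}: {reference}")

class Audio:
    def reorient(self, viewer_id, reference):      # second / fourth control signals
        print(f"audio reoriented for viewer {viewer_id}: {reference}")

class Processor:
    """Stand-in for the second circuitry that issues control signals."""
    def __init__(self, display, audio):
        self.display, self.audio = display, audio
    def handle(self, viewer_id, reference):        # steps 1604 / 1612
        self.display.adapt(viewer_id, reference)
        self.audio.reorient(viewer_id, reference)

proc = Processor(Display(), Audio())
proc.handle(1, Tracker({"azimuth_deg": -10}).reference_info())
proc.handle(2, Tracker({"azimuth_deg": +25}).reference_info())
```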
  • FIG. 17 depicts a flowchart 1700 of a method for delivering audio content to first and second viewers of a display capable of simultaneously presenting first video content to the first viewer and second video content to the second viewer.
  • The method of flowchart 1700 begins at step 1702 , in which first audio content is delivered to first viewer-located circuitry (e.g., reference information generation circuitry 1510 a of FIG. 15 ) carried by the first viewer of the display, the first audio content being associated with the first video content.
  • The first viewer-located circuitry may include first viewer-specific speakers of an audio system (e.g., audio system 1528 ).
  • Second audio content is simultaneously delivered to second viewer-located circuitry (e.g., reference information generation circuitry 1510 b of FIG. 15 ) carried by the second viewer of the display, the second audio content being associated with the second video content.
  • The second viewer-located circuitry may include second viewer-specific speakers of the audio system.
  • First and second viewers 1506 and 1536 may each view the same video content but customize the corresponding audio content that is being delivered thereto in one or more ways, including but not limited to content (e.g., choice of language) and audio settings (e.g., volume, mono vs. stereo sound, two-dimensional vs. three-dimensional audio, equalizer settings or the like).
  • The customizations applied to the audio content delivered to first viewer 1506 need not be applied to the audio content delivered to second viewer 1536 , and vice versa (a sketch of such per-viewer settings appears below).
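As a sketch of what such per-viewer customization might look like as data, consider the record below; the field names and defaults are assumptions for illustration, not the patent's.

```python
from dataclasses import dataclass

@dataclass
class ViewerAudioSettings:
    language: str = "en"            # choice of language
    volume: float = 0.8             # 0.0 .. 1.0
    stereo: bool = True             # mono vs. stereo sound
    three_d_audio: bool = True      # two-dimensional vs. three-dimensional audio
    eq_db: tuple = (0, 0, 0)        # equalizer settings (low/mid/high, in dB)

# Each viewer keeps an independent settings record.
viewer_1506 = ViewerAudioSettings(language="en")
viewer_1536 = ViewerAudioSettings(language="fr", volume=0.5, stereo=False)
```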
  • FIG. 18 is a block diagram of an example implementation of a media system, such as media system 102 described above in reference to FIG. 1 and media system 1502 described above in reference to FIG. 15 , in accordance with an embodiment.
  • Media system 1800 generally comprises processing circuitry 1802 , driver circuitry 1804 , a screen assembly 1806 , reference information generation circuitry 1808 , and audio system 1858 .
  • Processing circuitry 1802 includes a processing unit 1814 , which may comprise one or more general-purpose or special-purpose processors or one or more processing cores.
  • Processing unit 1814 is connected to a communication infrastructure 1812 , such as a communication bus.
  • Processing circuitry 1802 may also include a primary or main memory (not shown in FIG. 18 ), such as random access memory (RAM), that is connected to communication infrastructure 1812 .
  • The main memory may have control logic stored thereon for execution by processing unit 1814 , as well as data stored thereon that may be input to or output by processing unit 1814 during execution of such control logic.
  • Processing circuitry 1802 may also include one or more secondary storage devices (not shown in FIG. 18 ) that are connected to communication infrastructure 1812 , including but not limited to a hard disk drive, a removable storage drive (such as an optical disk drive, a floppy disk drive, a magnetic tape drive, or the like), or an interface for communicating with a removable storage unit such as an interface for communicating with a memory card, memory stick or the like.
  • Such secondary storage devices provide an additional means for storing control logic for execution by processing unit 1814 as well as data that may be input to or output by processing unit 1814 during execution of such control logic.
  • Processing circuitry 1802 further includes a user input interface 1818 , a reference information generation circuitry interface (I/F) 1816 , and a media interface 1820 .
  • User input interface 1818 is intended to generally represent any type of interface that may be used to receive user input, including but not limited to a remote control device, a traditional computer input device such as a keyboard or mouse, a touch screen, a gamepad or other type of gaming console input device, or one or more sensors including but not limited to video cameras, microphones and motion sensors.
  • Reference information generation circuitry interface 1816 is an interface that is suitable for connection to reference information generation circuitry 1808 and that allows processing circuitry 1802 to communicate therewith.
  • Reference information generation circuitry 1808 comprises circuitry that is configured to generate information about one or more positional characteristics of one or more viewing references associated with one or more viewers of media system 1800 .
  • Media interface 1820 is intended to represent any type of interface that is capable of receiving media content such as video content or image content.
  • For example, media interface 1820 may comprise an interface for receiving media content from a remote source such as a broadcast media server, an on-demand media server, or the like.
  • Media interface 1820 may comprise, for example and without limitation, a wired or wireless internet or intranet connection, a satellite interface, a fiber interface, a coaxial cable interface, or a fiber-coaxial cable interface.
  • Media interface 1820 may also comprise an interface for receiving media content from a local source such as a DVD or Blu-Ray® disc player, a personal computer, a personal media player, smart phone, or the like.
  • Media interface 1820 may be capable of retrieving video content from multiple sources.
  • Processing circuitry 1802 further includes a communication interface 1822 .
  • Communication interface 1822 enables processing circuitry 1802 to send control signals via a communication medium 1852 to another communication interface 1830 within driver circuitry 1804 , thereby enabling processing circuitry 1802 to control the operation of driver circuitry 1804 .
  • Communication medium 1852 may comprise any kind of wired or wireless communication medium suitable for transmitting such control signals.
  • Driver circuitry 1804 includes the aforementioned communication interface 1830 as well as pixel array driver circuitry 1832 , adaptable light manipulator driver circuitry 1834 , and speaker driver circuitry 1854 .
  • Driver circuitry 1804 also optionally includes light generator driver circuitry 1836 .
  • Each of pixel array driver circuitry 1832 , adaptable light manipulator driver circuitry 1834 , and light generator driver circuitry 1836 is configured to receive control signals from processing circuitry 1802 (via the link between communication interface 1822 and communication interface 1830 ) and, responsive thereto, to send selected drive signals to a corresponding hardware element within screen assembly 1806 , the drive signals causing the corresponding hardware element to operate in a particular manner.
  • For example, pixel array driver circuitry 1832 is configured to send selected drive signals to a pixel array 1842 within screen assembly 1806 , adaptable light manipulator driver circuitry 1834 is configured to send selected drive signals to an adaptable light manipulator 1844 within screen assembly 1806 , and optional light generator driver circuitry 1836 is configured to send selected drive signals to an optional light generator 1846 within screen assembly 1806 .
  • Driver circuitry 1804 also includes speaker driver circuitry 1854 , which is configured to receive control signals from processing circuitry 1802 (via the link between communication interface 1822 and communication interface 1830 ) and, responsive thereto, to send selected drive signals to speakers 1860 within audio system 1858 , the drive signals causing speakers 1860 to provide audio content having a specified spatial configuration.
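As one concrete, highly simplified example of a "specified spatial configuration" that speaker driver circuitry such as 1854 could realize, the sketch below turns a requested source azimuth into left/right drive gains using a constant-power pan law. This pan law is an assumption for illustration, not the patent's drive scheme.

```python
import math

# Constant-power panning sketch: map a requested source azimuth to
# left/right speaker gains (illustrative only).
def pan_gains(azimuth_deg: float) -> tuple:
    """azimuth_deg in [-45, +45]: -45 is hard left, +45 is hard right."""
    azimuth_deg = max(-45.0, min(45.0, azimuth_deg))
    theta = math.radians(azimuth_deg + 45.0)   # 0..90 degrees
    return math.cos(theta), math.sin(theta)    # (left_gain, right_gain)

print(pan_gains(0.0))   # centered source -> both gains ~0.707 (equal power)
print(pan_gains(45.0))  # hard right -> (~0.0, 1.0)
```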
  • Processing unit 1814 operates pursuant to control logic to receive visual and/or audio content via media interface 1820 and to generate control signals necessary to cause driver circuitry 1804 to render the visual content to screen assembly 1806 and/or the audio content to audio system 1858 in accordance with a selected viewing configuration.
  • The viewing configuration may be selected based on, for example, reference information generated by and received from reference information generation circuitry 1808 .
  • The control logic that is executed by processing unit 1814 may be retrieved, for example, from a primary memory or a secondary storage device connected to processing unit 1814 via communication infrastructure 1812 , as discussed above.
  • The control logic may also be retrieved from some other local or remote source. Where the control logic is stored on a computer readable medium, that computer readable medium may be referred to herein as a computer program product.
  • Driver circuitry 1804 may be controlled in a manner described in the aforementioned, incorporated U.S. patent application Ser. No. ______ (Attorney Docket No. A05.01240000), filed on even date herewith and entitled “Coordinated Driving of Adaptable Light Manipulator, Backlighting and Pixel Array in Support of Adaptable 2D and 3D Displays” (the entirety of which is incorporated by reference herein) to send coordinated drive signals necessary for displaying two-dimensional content and three-dimensional content via screen assembly 1806 . In certain operating modes, such content may be simultaneously displayed via different display regions of screen assembly 1806 .
  • The manner in which pixel array 1842 , adaptable light manipulator 1844 (e.g., an adaptable parallax barrier), and light generator 1846 may be manipulated in a coordinated fashion to perform this function is described in the patent application referenced immediately above. It will be recognized that speakers 1860 may be controlled in like manner to provide a coordinated three-dimensional audio and visual experience. Note that in accordance with certain implementations (e.g., implementations in which pixel array 1842 comprises an OLED/PLED pixel array), screen assembly 1806 need not include light generator 1846 .
  • At least part of the function of generating control signals necessary to cause pixel array 1842 , adaptable light manipulator 1844 and light generator 1846 to render visual content and/or to cause speakers 1860 to render audio content in accordance with a selected viewing configuration is performed by drive signal processing circuitry 1838 , which is integrated within driver circuitry 1804 .
  • Such circuitry may operate, for example, in conjunction with and/or under the control of processing unit 1814 to generate the necessary control signals.
  • In one embodiment, processing circuitry 1802 , driver circuitry 1804 , screen assembly 1806 , and potentially at least some portion of audio system 1858 are all included within a single housing.
  • For example, all of these elements may exist within a television, a laptop computer, a tablet computer, or a telephone.
  • In such an implementation, link 1850 formed between communication interfaces 1822 and 1830 may be replaced by a direct connection between driver circuitry 1804 and communication infrastructure 1812 .
  • In an alternate implementation, processing circuitry 1802 and potentially at least some portion of audio system 1858 are disposed within a first housing, such as a set top box or personal computer, and driver circuitry 1804 , screen assembly 1806 , and potentially at least some portion of audio system 1858 are disposed within a second housing, such as a television or computer monitor.
  • Audio system 1858 may be disposed within at least one third housing, such as a plurality of speaker assemblies located around a viewing area.
  • The set top box may be any type of set top box, including but not limited to fiber, Internet, cable, satellite, or terrestrial digital.
  • The various processing circuitry elements 126 , 1526 , and 1802 shown in the underlying figures may exist in whole or in part in any one or more media environment devices.
  • Examples of a media environment device include but are not limited to a home set top box, a location support unit, a gateway, an access point, a media player (e.g., a DVD, CD, or Blu-Ray player), a projection system, a display device (e.g., a television, a monitor, a personal computer, a phone, etc.), and so on.
  • Such processing circuitry attempts to identify a synchronized 3D viewing and listening experience based on reference information concerning at least one positional characteristic of a viewing and listening reference (i.e., orientation) of a viewer, as described above. Identification of a synchronized 3D viewing and listening experience may be performed in a variety of ways. For example, appropriate camera views may be selected to be displayed. For instance, a 3D8 data set might drive a 3D2 display, wherein particular pairs of camera perspectives/views are selected based on the viewer's reference information (a selection sketch follows below). In another example, interpolation might be applied to generate interpolated views. For instance, interpolation may be applied to a 3D2 data set to produce perhaps four distinct camera views (wherein two of the camera views are interpolations). Lastly, for the video, the light manipulator itself in a 3D8 display might provide the 3D visual experience with a changing eye reference point.
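The 3D8-to-3D2 view-pair selection mentioned above can be illustrated with a small sketch. The viewing-zone geometry assumed below (eight views spread across a 40-degree zone) is hypothetical; only the idea of selecting adjacent camera views from the viewer's reference information comes from the text.

```python
# Choose which adjacent pair of an 8-view (3D8) data set should feed a
# two-view (3D2) display, given the viewer's horizontal angle.
def select_view_pair(viewer_azimuth_deg: float,
                     num_views: int = 8,
                     fov_deg: float = 40.0) -> tuple:
    """Map an azimuth in [-fov/2, +fov/2] to adjacent (left, right) views."""
    half = fov_deg / 2.0
    t = (viewer_azimuth_deg + half) / fov_deg        # 0.0 .. 1.0 across the zone
    t = min(max(t, 0.0), 1.0)
    left = min(int(t * (num_views - 1)), num_views - 2)
    return left, left + 1

print(select_view_pair(0.0))    # centered viewer -> views (3, 4)
print(select_view_pair(-20.0))  # leftmost position -> views (0, 1)
```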
  • A 3D audio experience may change in sync with a changing 3D visual experience in a variety of ways.
  • For example, the processing circuitry might smoothly migrate between a plurality of audio channel sets, each of which corresponds to one of the plurality of visual reference points (a crossfade sketch appears after this passage).
  • An audio channel set may be, for example, a Dolby 5.1 set, wherein each set is captured or produced for a particular reference point that may correspond to that of the various camera views.
  • In addition, audio sets for each specified noise source having a 3D origin can be captured or produced independently. For example, such audio sets may include a general background set (e.g., music, etc.); a distant explosion may be captured/produced from 4 reference points; and local speech from an on-screen, relatively close actor might be captured/produced with 8 reference points, all in a 3D2 or a 3D4 environment.
  • In this manner, both a different 3D visual experience and a different 3D listening experience (i.e., a changing 3D sensory environment) may be delivered as the viewer's reference point changes.
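A minimal sketch of the "smooth migration" between audio channel sets might be a linear crossfade between the two sets whose reference points bracket the viewer's current reference. The blend below operates on per-channel gains for brevity; a real system would crossfade the sample streams themselves, and the channel names and blending law are assumptions.

```python
CHANNELS = ("L", "R", "C", "LFE", "Ls", "Rs")   # Dolby 5.1-style labels

def migrate(set_a: dict, set_b: dict, t: float) -> dict:
    """t = 0.0 selects reference point A, t = 1.0 selects reference point B."""
    return {ch: (1.0 - t) * set_a[ch] + t * set_b[ch] for ch in CHANNELS}

# Halfway between two neighboring reference points, channels blend evenly.
set_a = {ch: 1.0 for ch in CHANNELS}
set_b = {ch: 0.5 for ch in CHANNELS}
print(migrate(set_a, set_b, 0.5))
```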

Abstract

Techniques are described herein for supporting 3D audio delivery accompanying 3D display supported by viewer/listener position and orientation tracking. For example, audio content is configured to have a spatial orientation that accords with an orientation of a viewer. A spatial orientation of audio content is a configuration of the audio content in which characteristics of respective portions of the audio content, which correspond to respective speakers, indicate an orientation of ears of a viewer/listener with respect to sound source(s) that correspond to (e.g., are depicted in) a three-dimensional view. Such a characteristic may include an amplitude of sound that corresponds to a sound source and/or a delay associated with the sound that corresponds to the sound source.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application claims the benefit of U.S. Provisional Application No. 61/291,818, filed on Dec. 31, 2009, which is incorporated by reference herein in its entirety.
  • This application also claims the benefit of U.S. Provisional Application No. 61/303,119, filed on Feb. 10, 2010, which is incorporated by reference herein in its entirety.
  • This application is also related to the following U.S. Patent Applications, each of which also claims the benefit of U.S. Provisional Patent Application Nos. 61/291,818 and 61/303,119 and each of which is incorporated by reference herein:
  • U.S. patent application Ser. No. 12/774,225, filed on May 5, 2010 and entitled “Controlling a Pixel Array to Support an Adaptable Light Manipulator”;
  • U.S. patent application Ser. No. 12/774,307, filed on May 5, 2010 and entitled “Display with Elastic Light Manipulator”;
  • U.S. patent application Ser. No. 12/845,409, filed on Jul. 28, 2010, and entitled “Display with Adaptable Parallax Barrier”;
  • U.S. patent application Ser. No. 12/845,440, filed on Jul. 28, 2010, and entitled “Adaptable Parallax Barrier Supporting Mixed 2D and Stereoscopic 3D Display Regions”;
  • U.S. patent application Ser. No. 12/845,461, filed on Jul. 28, 2010, and entitled “Display Supporting Multiple Simultaneous 3D Views”;
  • U.S. patent application Ser. No. ______ (Attorney Docket No. A05.01210000), filed on even date herewith and entitled “Backlighting Array Supporting Adaptable Parallax Barrier”;
  • U.S. patent application Ser. No. ______ (Attorney Docket No. A05.01240000), filed on even date herewith and entitled “Coordinated Driving of Adaptable Light Manipulator, Backlighting and Pixel Array in Support of Adaptable 2D and 3D Displays”;
  • U.S. patent application Ser. No. ______ (Attorney Docket No. A05.01390000), filed on even date herewith and entitled “Three-Dimensional Display System With Adaptation Based on Viewing Reference of Viewer(s)”;
  • U.S. patent application Ser. No. ______ (Attorney Docket No. A05.01400000), filed on even date herewith and entitled “Remote Control with Integrated Position, Viewer Identification and Optical and Audio Test”; and
  • U.S. patent application Ser. No. ______ (Attorney Docket No. A05.01420000), filed on even date herewith and entitled “Multiple Remote Controllers that Each Simultaneously Controls a Different Visual Presentation of a 2D/3D Display.”
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The present invention generally relates to techniques for delivering audio to a listener based on an orientation of the listener.
  • 2. Background Art
  • Images may be generated for display in various forms. For instance, television (TV) is a widely used telecommunication medium for transmitting and displaying images in monochromatic (“black and white”) or color form. Conventionally, images are provided in analog form and are displayed by display devices in two-dimensions. More recently, images are being provided in digital form for display in two-dimensions on display devices having improved resolution (e.g., “high definition” or “HD”). Even more recently, images capable of being displayed in three-dimensions are being generated.
  • Conventional displays may use a variety of techniques to achieve three-dimensional image viewing functionality. For example, various types of glasses have been developed that may be worn by users to view three-dimensional images displayed by a conventional display. Examples of such glasses include glasses that utilize color filters or polarized filters. In each case, the lenses of the glasses pass two-dimensional images of differing perspective to the user's left and right eyes. The images are combined in the visual center of the brain of the user to be perceived as a three-dimensional image. In another example, synchronized left eye, right eye LCD (liquid crystal display) shutter glasses may be used with conventional two-dimensional displays to create a three-dimensional viewing illusion. In still another example, LCD display glasses are being used to display three-dimensional images to a user. The lenses of the LCD display glasses include corresponding displays that provide images of differing perspective to the user's eyes, to be perceived by the user as three-dimensional.
  • Problems exist with such techniques for viewing three-dimensional images. For instance, persons that use such displays and systems to view three-dimensional images may suffer from headaches, eyestrain, and/or nausea after long exposure. Furthermore, some content, such as two-dimensional text, may be more difficult to read and interpret when displayed three-dimensionally. To address these problems, some manufacturers have created display devices that may be toggled between three-dimensional viewing and two-dimensional viewing. A display device of this type may be switched to a three-dimensional mode for viewing of three-dimensional images, and may be switched to a two-dimensional mode for viewing of two-dimensional images (and/or to provide a respite from the viewing of three-dimensional images).
  • A parallax barrier is another example of a device that enables images to be displayed in three-dimensions. A parallax barrier includes a layer of material with a series of precision slits. The parallax barrier is placed proximal to a display so that a user's eyes each see a different set of pixels to create a sense of depth through parallax. A disadvantage of parallax barriers is that the viewer must be positioned in a well-defined location in order to experience the three-dimensional effect. If the viewer moves his/her eyes away from this “sweet spot,” image flipping and/or exacerbation of the eyestrain, headaches and nausea that may be associated with prolonged three-dimensional image viewing may result. Conventional three-dimensional displays that utilize parallax barriers are also constrained in that the displays must be entirely in a two-dimensional image mode or a three-dimensional image mode at any time.
  • One common technique for improving an audio experience of viewers of two-dimensional display devices is referred to as “surround sound”. Surround sound provides different audio content through different audio channels in an effort to provide a fixed or forward perspective of a sound field to a viewer/listener at a fixed location (e.g., the aforementioned “sweet spot”). The audio channels correspond to locations of speakers that surround the viewer (e.g., right, left, front, back, etc.). However, surround sound has its limitations, especially when applied with respect to three-dimensional display devices. For example, if a viewer moves away from the fixed location, a degradation of the viewer's audio experience may result. For instance, as a viewer moves about a room, the sound that is projected from a speaker may be perceived as becoming louder as the viewer nears that speaker. The sound that is projected from that speaker may be perceived as becoming quieter as the viewer moves away from the speaker. If the viewer is wearing a viewer-specific audio device (e.g., headphones or earbuds), movements by the viewer may have no effect on the viewer's perception of the sounds that are projected from speakers in the viewer-specific device.
  • BRIEF SUMMARY OF THE INVENTION
  • Methods, systems, and apparatuses are described for supporting 3D audio delivery accompanying 3D display supported by viewer/listener position and orientation tracking as shown in and/or described herein in connection with at least one of the figures, as set forth more completely in the claims.
  • BRIEF DESCRIPTION OF THE DRAWINGS/FIGURES
  • The accompanying drawings, which are incorporated herein and form a part of the specification, illustrate the present invention and, together with the description, further serve to explain the principles of the invention and to enable a person skilled in the pertinent art to make and use the invention.
  • FIG. 1A is a block diagram of a media system in accordance with an embodiment.
  • FIG. 1B shows an exemplary content capturing system in accordance with an embodiment.
  • FIG. 2 shows an exemplary viewer-located implementation of an audio system shown in FIG. 1 in accordance with an embodiment.
  • FIG. 3 is a block diagram of a media system in accordance with a first embodiment that includes reference information generation circuitry that implements a triangulation technique to determine an estimated location of a viewer.
  • FIG. 4 is a block diagram of a media system in accordance with a second embodiment that includes reference information generation circuitry that implements a triangulation technique to determine an estimated location of a viewer.
  • FIG. 5 is a block diagram of a media system in accordance with an embodiment that includes reference information generation circuitry that implements an infrared (IR) distance measurement system to help determine an estimated location of a viewer.
  • FIG. 6 is a block diagram of a media system in accordance with an embodiment that includes reference information generation circuitry that implements a magnetic field detection system to help determine an estimated location of a viewer.
  • FIG. 7 is a block diagram of a media system in accordance with an embodiment that includes viewer-located reference information generation circuitry that includes one or more cameras and one or more microphones for facilitating the generation of reference information corresponding to at least one positional characteristic of a viewing reference of a viewer.
  • FIG. 8 is a block diagram of a media system in accordance with an embodiment that includes reference information generation circuitry that includes a head orientation sensor and eye tracking circuitry for determining a head orientation and point of gaze, respectively, of a viewer.
  • FIG. 9 is a block diagram of a media system in accordance with an embodiment in which non-viewer-located camera(s) and/or microphone(s) operate to generate reference information corresponding to at least one positional characteristic of a viewing reference of a viewer.
  • FIG. 10 depicts a headset in accordance with an embodiment that includes reference information generation circuitry for facilitating the generation of reference information corresponding to at least one positional characteristic of a viewing reference of a viewer.
  • FIG. 11A depicts an embodiment in which reference information generation circuitry is distributed among a headset and a remote control that are connected to each other by a wired communication link.
  • FIG. 11B depicts an embodiment in which reference information generation circuitry is distributed among a headset and a laptop computer that are connected to each other by a wireless communication link.
  • FIG. 12 depicts a flowchart of a method for presenting three-dimensional content to a viewer having a viewing reference in accordance with an embodiment, wherein the manner in which such content is presented is controlled in accordance with reference information concerning the viewing reference.
  • FIG. 13 depicts a flowchart of a method for delivering video output and audio output to a viewer based at least in part on positional characteristic(s) relating to an orientation of the viewer in accordance with an embodiment.
  • FIG. 14 depicts a flowchart of a method for delivering an audio experience for ears of a listener via a plurality of speakers in accordance with an embodiment.
  • FIG. 15 is a block diagram of a media system in accordance with an embodiment that simultaneously presents first three-dimensional content to a first viewer having a first viewing reference and second three-dimensional content to a second viewer having a second viewing reference, wherein the manner in which such content is displayed is controlled in accordance with reference information concerning the first and second viewing references.
  • FIG. 16 depicts a flowchart of a method for presenting first three-dimensional content to a first viewer having a first viewing reference and simultaneously presenting second three-dimensional content to a second viewer having a second viewing reference in accordance with an embodiment, wherein the manner in which such content is presented is controlled in accordance with reference information concerning the first and second viewing references.
  • FIG. 17 depicts a flowchart of a method for delivering audio content to first and second viewers of a display capable of simultaneously presenting first video content to the first viewer and second video content to the second viewer in accordance with an embodiment.
  • FIG. 18 is a block diagram of an example implementation of an adaptable two-dimensional/three-dimensional media system in accordance with an embodiment.
  • The present invention will now be described with reference to the accompanying drawings. In the drawings, like reference numbers indicate identical or functionally similar elements. Additionally, the left-most digit(s) of a reference number identifies the drawing in which the reference number first appears.
  • DETAILED DESCRIPTION OF THE INVENTION
  • I. Introduction
  • The present specification discloses one or more embodiments that incorporate the features of the invention. The disclosed embodiment(s) merely exemplify the invention. The scope of the invention is not limited to the disclosed embodiment(s). The invention is defined by the claims appended hereto.
  • References in the specification to “one embodiment,” “an embodiment,” “an example embodiment,” etc., indicate that the embodiment described may include a particular feature, structure, or characteristic, but every embodiment may not necessarily include the particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same embodiment. Further, when a particular feature, structure, or characteristic is described in connection with an embodiment, it is submitted that it is within the knowledge of one skilled in the art to effect such feature, structure, or characteristic in connection with other embodiments whether or not explicitly described.
  • Furthermore, it should be understood that spatial descriptions (e.g., “above,” “below,” “up,” “left,” “right,” “down,” “top,” “bottom,” “vertical,” “horizontal,” etc.) used herein are for purposes of illustration only, and that practical implementations of the structures described herein can be spatially arranged in any orientation or manner.
  • II. Example Three-Dimensional Media Systems with Adaptation Based on Viewer Orientation
  • FIG. 1A is a block diagram of a media system 102 that presents three-dimensional content to a viewer 106 having a viewing reference 108 (i.e., an orientation) in accordance with an embodiment. As shown in FIG. 1A, media system 102 includes an adaptable screen assembly 122, driver circuitry 124, processing circuitry 126, reference information generation circuitry 110 a, and reference information generation circuitry 110 b.
  • Generally speaking, media system 102 operates to deliver light that includes one or more viewable images to a viewing area that includes viewer 106. Media system 102 also operates to deliver audio content that is associated with the one or more viewable images toward viewer 106. Media system 102 may include, for example and without limitation, a television, a projection system, a home theater system, a monitor, a computing device (e.g., desktop computer, laptop computer, tablet computer) or a handheld device (e.g., a cellular phone, smart phone, personal media player, personal digital assistant), wherein the computing device or handheld device has at least one attached or integrated display.
  • Adaptable screen assembly 122 is designed such that certain display characteristics associated therewith can be modified to support multiple viewing modes. For example, certain display characteristics associated with adaptable screen assembly 122 may be modified to selectively present images in a two-dimensional viewing mode or one or more three-dimensional viewing modes. For example, in certain implementations, display characteristics associated with screen assembly 122 may be modified to display a single image of certain subject matter to provide a two-dimensional view thereof, to display two images of the same subject matter viewed from different perspectives in a manner that provides a single three-dimensional view thereof, or to display a multiple of two images (e.g., four images, eight images, etc.) of the same subject matter viewed from different perspectives in a manner that simultaneously provides multiple three-dimensional views thereof, wherein the particular three-dimensional view perceived by a viewer is dependent at least in part upon the position of the viewer (also referred to herein as a “multi-view three-dimensional viewing mode”).
  • Various examples of adaptable screen assemblies that can be modified to support such two-dimensional and three-dimensional viewing modes are described in the following commonly-owned, co-pending U.S. Patent Applications: U.S. patent application Ser. No. 12/774,307, filed on May 5, 2010 and entitled “Display with Elastic Light Manipulator”; U.S. patent application Ser. No. 12/845,409, filed on Jul. 28, 2010 and entitled “Display with Adaptable Parallax Barrier”; and U.S. patent application Ser. No. 12/845,461, filed on Jul. 28, 2010 and entitled “Display Supporting Multiple Simultaneous 3D Views.” The entirety of each of these applications is incorporated by reference herein. Adaptable screen assembly 122 may be implemented in accordance with descriptions provided in the above-referenced applications.
  • In addition to the foregoing capabilities, adaptable screen assembly 122 may also be capable of simultaneously presenting two dimensional views and three-dimensional views in different regions of the same screen, respectively. By way of example, adaptable screen assembly 122 may be capable of simultaneously presenting a two-dimensional view of first visual content in a first region of a screen, and one or more three-dimensional views of second visual content in a second region of the screen. Adaptable screen assemblies having such capabilities are described in commonly-owned, co-pending U.S. patent application Ser. No. 12/845,440, filed on Jul. 28, 2010, and entitled “Adaptable Parallax Barrier Supporting Mixed 2D and Stereoscopic 3D Display Regions,” the entirety of which is incorporated by reference herein.
  • A display characteristic of adaptable screen assembly 122 that may be modified to switch between different full-screen and regional two-dimensional and three-dimensional viewing modes may include a configuration of an adaptable light manipulator such as an adaptable parallax barrier. An adaptable lenticular lens may also be used as an adaptable light manipulator to switch between different full-screen three-dimensional viewing modes. Descriptions of such adaptable light manipulators and methods for dynamically modifying the same may be found in the aforementioned, incorporated U.S. patent application Ser. No. 12/774,307, filed on May 5, 2010 and entitled “Display with Elastic Light Manipulator” and U.S. patent application Ser. No. 12/845,409, filed on Jul. 28, 2010 and entitled “Display with Adaptable Parallax Barrier.” For example, the degree of stretching of an adaptable lenticular lens may be modified in order to support certain three-dimensional viewing modes. As another example, barrier elements of an adaptable parallax barrier may be selectively placed in a blocking or non-blocking state in order to support certain full-screen and regional two-dimensional and three-dimensional viewing modes.
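To make the blocking/non-blocking barrier states concrete, the sketch below computes a slit pattern for a row of barrier elements from a desired period and phase; shifting the phase is one plausible way to steer viewing zones toward a moved viewer. The geometry is an illustrative assumption, not the drive scheme of the incorporated applications.

```python
# Compute a blocking/non-blocking pattern for a row of adaptable
# parallax-barrier elements (illustrative geometry only).
def barrier_pattern(num_elements: int, period: int, phase: int, slit_width: int = 1):
    """Return True for non-blocking (slit) elements, False for blocking ones."""
    return [((i - phase) % period) < slit_width for i in range(num_elements)]

# Shifting the phase by one element shifts every slit, steering the
# left-eye/right-eye viewing zones toward a viewer who has moved.
print(barrier_pattern(12, period=4, phase=0))
print(barrier_pattern(12, period=4, phase=1))
```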
  • Another display characteristic of adaptable screen assembly 122 that may be modified to switch between different full-screen and regional two-dimensional and three-dimensional viewing modes may include the manner in which image content is mapped to display pixels of a pixel array, as described in commonly-owned, co-pending U.S. patent application Ser. No. 12/774,225, filed on May 5, 2010 and entitled “Controlling a Pixel Array to Support an Adaptable Light Manipulator,” the entirety of which is incorporated by reference herein. Yet another display characteristic that may be modified to achieve such switching includes the manner in which backlighting is generated by a backlighting array or other non-uniform light generation element, as described in commonly-owned, co-pending U.S. patent application Ser. No. ______ (Attorney Docket No. A05.01210000), filed on even date herewith and entitled “Backlighting Array Supporting Adaptable Parallax Barrier,” the entirety of which is incorporated by reference herein.
  • The adaptation of the display characteristics of adaptable screen assembly 122 may be carried out, in part, by sending coordinated drive signals to various elements (e.g., a non-uniform backlight generator, a pixel array and/or an adaptable light manipulator) that are included in adaptable screen assembly 122. This function is performed by driver circuitry 124 responsive to the receipt of control signals from processing circuitry 126. A manner in which such coordinated drive signals may be generated is described in U.S. patent application Ser. No. ______ (Attorney Docket No. A05.01240000), filed on even date herewith and entitled “Coordinated Driving of Adaptable Light Manipulator, Backlighting and Pixel Array in Support of Adaptable 2D and 3D Displays.”
  • Audio system 128 includes a plurality of speakers 130 a-130 g that provide respective portions 132 a-132 g of audio output based on underlying audio content; that audio content is associated with the visual output, based on underlying video content, that adaptable screen assembly 122 delivers to viewer 106. Audio system 128 provides the portions 132 a-132 g in accordance with a spatial orientation of the audio content that is controlled by driver circuitry 124. A spatial orientation of audio content is a configuration of the audio content in which characteristics of respective portions (e.g., portions 132 a-132 g) of the audio content, which correspond to respective speakers (e.g., speakers 130 a-130 g), indicate an orientation of ears of a viewer/listener (e.g., viewer 106) with respect to sound source(s) that correspond to (e.g., are depicted in) a three-dimensional view. Such a characteristic may include an amplitude of sound that corresponds to a sound source and/or delays that are associated with the sound that corresponds to the sound source and associated reflections thereof.
  • As mentioned above, adaptable screen assembly 122 may deliver a three-dimensional view to viewer 106. Driver circuitry 124 and speakers 130 a-130 g may operate in conjunction to provide portions 132 a-132 g of audio content that are associated with the three-dimensional view. For example, driver circuitry 124 may modify a spatial orientation of the audio content to take into consideration a change in orientation of viewer 106 with respect to sound sources that correspond to (e.g., are depicted in) the three-dimensional view and/or changes in orientation of such sound sources with respect to viewer 106. Driver circuitry 124 modifies the spatial orientation of the audio content responsive to the receipt of control signals from processing circuitry 126.
  • For example, the audio content may include a first audio portion 132 a that corresponds to a right side speaker 130 a and a second audio portion 132 b that corresponds to a left side speaker 130 b. In accordance with this example, if viewer 106 moves to the right of a sound source in the context of the three-dimensional view, the spatial orientation of the audio content may be modified by decreasing an amplitude of a sound that corresponds to the sound source in the first audio portion 132 a and/or increasing a delay that is associated with the sound in the first audio portion 132 a. In addition or alternatively, the spatial orientation of the audio content may be modified by increasing an amplitude of the sound in the second audio portion 132 b and/or decreasing a delay that is associated with the sound in the second audio portion 132 b. Accordingly, the resulting spatial orientation of the audio content may provide an audio experience that is consistent with the viewer moving to the right of the sound source in a real-world environment.
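A numeric sketch can make the amplitude/delay adjustments above concrete. The free-field model below (inverse-distance amplitude, propagation delay at 343 m/s) and the coordinates are assumptions for illustration; the patent does not prescribe this model.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s

def ear_feed(source_xy, ear_xy, reference_gain=1.0):
    """Gain and delay for one ear's feed from a depicted sound source."""
    dist = math.hypot(source_xy[0] - ear_xy[0], source_xy[1] - ear_xy[1])
    gain = reference_gain / max(dist, 0.1)       # louder when closer
    delay_ms = dist / SPEED_OF_SOUND * 1000.0    # later when farther
    return gain, delay_ms

# The viewer has stepped to the right of a sound source located at (0, 2):
# the left ear (x = 0.9) is now nearer the source than the right ear
# (x = 1.1), so the right-ear feed gets lower gain and greater delay,
# consistent with the example above.
print(ear_feed((0.0, 2.0), (0.9, 0.0)))   # left ear
print(ear_feed((0.0, 2.0), (1.1, 0.0)))   # right ear
```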
  • For example, in one embodiment, first and second objects, both identical visually and audibly, might be presented in a three-dimensional sensory environment via both the adaptable screen assembly 122 and the speakers 130 a-g. Therein, the first object might be presented as being much closer to the viewer 106 than the second object. As such, for the first object, the audio output from those of the speakers 130 a-g to the right of the viewer versus those to the left might provide first left side and first right side delays and amplitudes. Similarly, for the second object, the audio output might provide second left side and second right side delays and amplitudes. If the viewer 106 changes their point of reference, e.g., moves to look at something behind the first object, the first left and right delays and amplitudes associated with the first object will be adapted to produce a listening experience that corresponds to the visual change in reference and the associated visual experience. The second left and right delays and amplitudes will be adapted differently because of the farther location of the second object in the 3D visual experience. That is, because the second object is to be perceived as being far away, its visual presentation will not change as much or as rapidly as the first object appears to change visually. Thus, the audio output portion corresponding to the second object (that producing the audible 3D listening experience associated with the second object) need not change as much or as fast either. In other words, each of a plurality of audio/video objects has a 3D position and orientation that adapts both visually and audibly to attempt to convince a viewer-listener that the experience is a real-world environment.
  • Audio system 128 is shown in FIG. 1A to include seven speakers 130 a-130 g for illustrative purposes and is not intended to be limiting. It will be recognized that audio system 128 may include any suitable number of speakers. Moreover, such speakers may be arranged in any suitable configuration. Audio system 128 may include viewer-specific speakers (e.g., speakers in headphones or earbuds) and/or viewer-agnostic speakers (e.g., speakers mounted on walls of a room or in a dashboard, door panels, etc. of a car). An example of a viewer-located implementation of audio system 128 that includes viewer-specific speakers is shown in FIG. 2, which is described below.
  • Reference information generation circuitry 110 a and 110 b comprise components of media system 102 that operate in conjunction to produce reference information concerning at least one positional characteristic of viewing and listening reference 108 (i.e., orientation) of viewer 106 with respect to adaptable screen assembly 122. Viewing and listening reference 108 may include any of a number of positional characteristics that affect how three-dimensional visual content displayed via adaptable screen assembly 122 and/or how audio content provided by audio system 128 will be perceived by viewer 106. Such positional characteristics may include, for example and without limitation, a position or location of viewer 106 relative to adaptable screen assembly 122, a head orientation of viewer 106, ear orientation of the viewer 106, and/or a point of gaze of viewer 106. The position or location of viewer 106 (and both the eyes and ears thereof) relative to adaptable screen assembly 122 may include a distance from adaptable screen assembly 122 or some reference point associated therewith, and such distance may include both horizontal distance and elevation. Furthermore, the position or location of viewer 106 may also include eye and ear locations of viewer 106. The head orientation of viewer 106 may include a degree of tilt and/or rotation of the head of viewer 106.
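For illustration, the positional characteristics just enumerated could be gathered into a single record such as the following; the field names and units are assumptions, not the patent's.

```python
from dataclasses import dataclass
from typing import Tuple

Point = Tuple[float, float, float]  # x, y (horizontal), z (elevation), in meters

@dataclass
class ViewingListeningReference:
    position: Point                        # viewer location relative to the screen
    head_tilt_deg: float = 0.0             # degree of tilt of the head
    head_rotation_deg: float = 0.0         # degree of rotation of the head
    eye_positions: Tuple[Point, ...] = ()  # eye locations of the viewer
    ear_positions: Tuple[Point, ...] = ()  # ear locations of the viewer
    gaze_point: Point = (0.0, 0.0, 0.0)    # point of gaze
```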
  • The reference information produced by reference information generation circuitry 110 a and 110 b is provided to processing circuitry 126. Based on at least the reference information, processing circuitry 126 causes modification of at least one of the display characteristics of adaptable screen assembly 122 and/or modification of the spatial orientation of the audio content that is provided by audio system 128. Such modifications may be caused by causing appropriate drive signals to be generated by driver circuitry 124. Such modification may be performed, for example, to deliver a particular three-dimensional view to viewer 106 and/or tailored audio output, based on underlying audio content associated therewith, in accordance with one or more positional characteristics of viewing and listening reference 108. For example, such modifications may be performed to deliver a particular three-dimensional view and/or associated audio content to an estimated location of viewer 106 (including an eye location of viewer 106) and/or in an orientation that corresponds to an orientation of viewer 106. Thus, by producing and providing such reference information to processing circuitry 126, media system 102 is capable of delivering three-dimensional content and/or associated audio content to viewer 106 in an optimized manner.
  • Reference information generation circuitry 110 a is intended to represent viewer-located circuitry that is situated on or near viewer 106. For example, reference information generation circuitry 110 a may comprise circuitry that is incorporated into one or more portable devices or housings which are worn on or carried by viewer 106. Such portable devices or housings may include, but are not limited to, a headset, glasses, an earplug, a pendant, a wrist-mounted device, a remote control, a game controller, a handheld personal device (such as a cellular telephone, smart phone, personal media player, personal digital assistant or the like) and a portable computing device (such as a laptop computer, tablet computer or the like). Such viewer-located circuitry may be designed to leverage a proximity to the user to assist in generating the above-described reference information.
  • Reference information generation circuitry 110 b is intended to represent circuitry that is not viewer-located. As will be discussed in reference to particular embodiments described herein, reference information generation circuitry 110 b is configured to operate in conjunction with reference information generation circuitry 110 a to generate the above-described reference information and to provide such reference information to processing circuitry 126.
  • Reference information generation circuitry 110 b is shown in FIG. 1A to include first stationary location support circuitry (SLSC) 134 a and second SLSC 134 b for illustrative purposes, though the embodiments are not limited in this respect. First and second SLSCs 134 a and 134 b are configured to support production of the reference information described above.
  • In one embodiment, first and second SLSCs 134 a and 134 b exchange signals (either sending or receiving, depending on the embodiment) with a viewer positioned device (VPD) to assist in generating location information (via trilateration, triangulation, etc.) regarding viewer 106. For instance, the VPD may be included in reference information generation circuitry 110 a. At least a portion of the first and second SLSCs 134 a and 134 b may be within separate housings with a communication link back toward processing circuitry 126.
  • In another embodiment, only one SLSC (e.g., first SLSC 134 a or second SLSC 134 b) housed with processing circuitry 126 might be used. For example, such a SLSC may capture images of the viewing environment (perhaps even in the infrared spectrum and with a lesser-resolution camera) to support identification of viewer 106 and gathering of the viewer's associated location, eye/ear orientation or some other reference. Further SLSCs can support more accurate gathering of such reference information, or gathering of further reference information. The reference information may be based on sensor data within the viewer positioned devices (VPDs) as well, including a similar type of camera that captures a screen image and, based thereon, attempts to generate orientation and distance information.
  • There are many embodiments for carrying out trilateration or triangulation to gather at least a portion of the reference information. In a first embodiment, a VPD transmits only, and two or more SLSCs (e.g., first and second SLSCs 134 a and 134 b) receive only. In a second embodiment, a VPD receives only, and two or more SLSCs transmit only. In a third embodiment, two or more SLSCs transmit first and a VPD responds with time markers without (e.g., when accurate time synchronization between the SLSCs and the VPD exists) or with the SLSCs recording total round trip time and subtracting therefrom local turn-around time via marker info and the VPD (potentially unsynchronized) clocking. In a fourth embodiment, a VPD transmits first and two or more SLSCs respond with time markers without or with the VPD recording total round trip time and subtracting therefrom local turn-around time via marker info and the SLSCs (potentially unsynchronized) clocking.
• In a fifth embodiment, any of the techniques described above with reference to the first through fourth embodiments may be used, with the underlying communication circuitry supporting both location determination and normal, unrelated communications. In a sixth embodiment, any of the techniques described above with reference to the first through fifth embodiments may be used with the SLSCs (and/or/via processing circuitry 126) coordinating timing amongst themselves. In a seventh embodiment, any of the techniques described above with reference to the first through sixth embodiments may be used, with the actual calculations based on such gathered information performed anywhere (e.g., in whole or in part in the VPD, one or more of the SLSCs, and/or processing circuitry 126).
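• By way of illustration only, the following Python sketch shows the round-trip timing described in the third and fourth embodiments above, together with a simple 2D trilateration from three such distances. The function names, the two-dimensional coordinate frame, and the use of exactly three anchors are assumptions of the sketch, not requirements of the embodiments:

```python
import math

SPEED_OF_LIGHT = 299_792_458.0  # meters per second, for RF tracking signals


def rtt_distance(total_round_trip_s, turn_around_s):
    """Estimate the one-way distance between a VPD and an SLSC.

    total_round_trip_s: elapsed time from transmission to receipt of the
        response, measured on the initiating device's own clock.
    turn_around_s: the responder's local processing delay, reported via its
        time markers; only a duration is needed, so the responder's clock
        need not be synchronized with the initiator's.
    """
    one_way_s = (total_round_trip_s - turn_around_s) / 2.0
    return one_way_s * SPEED_OF_LIGHT


def trilaterate_2d(anchors, distances):
    """Solve for a 2D position from three known anchor points (e.g., SLSC
    locations) and measured distances, by linearizing the circle equations."""
    (x1, y1), (x2, y2), (x3, y3) = anchors
    r1, r2, r3 = distances
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = r1 ** 2 - r2 ** 2 + x2 ** 2 - x1 ** 2 + y2 ** 2 - y1 ** 2
    b2 = r1 ** 2 - r3 ** 2 + x3 ** 2 - x1 ** 2 + y3 ** 2 - y1 ** 2
    det = a11 * a22 - a12 * a21  # zero if the anchors are collinear
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)
```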
• In certain implementations, adaptable screen assembly 122, driver circuitry 124, processing circuitry 126, reference information generation circuitry 110 b, and optionally some portion of audio system 128 are all integrated within a single housing (e.g., a television or other display device). In alternate embodiments, adaptable screen assembly 122, driver circuitry 124, at least some portion of processing circuitry 126, and optionally some portion of audio system 128 are integrated within a first housing (e.g., a television); reference information generation circuitry 110 b and optionally some portion of processing circuitry 126 are integrated within a second housing attached thereto (e.g., a set-top box, gateway device or media device); and at least a portion of audio system 128 is integrated within one or more third housings (e.g., one or more stand-alone speaker assemblies) connected via wired or wireless connection(s) to driver circuitry 124.
  • Further embodiments place a first portion of the reference information generation circuitry 110 b within a housing of a first location support unit that is situated at a first location within the viewing environment. Similarly, a second portion of the reference information generation circuitry 110 b is disposed within a housing of a second location support unit that is placed at a second location within the viewing environment. In such embodiments, the first and second location support units coordinate their activities in interaction with the reference information generation circuitry 110 a to produce at least a portion of the reference information. Such production may involve 2D/3D trilateration, 2D/3D triangulation, or other location processing to yield such portion of the reference information. Still other arrangements and distributions of this circuitry may be used.
  • FIG. 1B shows an exemplary content capturing system 100 in accordance with an embodiment. As shown in FIG. 1B, content capturing system 100 includes a plurality of cameras 160A-160D, a plurality of background microphones 162A-162D, a plurality of first target microphones 154A-154D, a plurality of second target microphones 156A-156D, and a plurality of Nth target microphones 158A-158D. Cameras 160A-160D are configured to simultaneously capture respective instances of content that represent respective camera views (a.k.a. perspectives) of common subject matter. For example, such subject matter may include a plurality of audio targets, such as first audio target 152A, second audio target 152B, and Nth audio target 152N. Any two of the perspectives may be combined to provide a 3D viewing experience.
  • Background microphones 162A-162D simultaneously capture instances of audio content that correspond to the respective instances of content that are captured by cameras 160A-160D. For instance, background microphones 162A-162D may be placed proximate to (or attached to) respective cameras 160A-160D. First target microphones 154A-154D are placed proximate to first audio target 152A for capturing sounds that are provided by first audio target 152A. Second target microphones 156A-156D are placed proximate to second audio target 152B for capturing sounds that are provided by second audio target 152B. Nth target microphones 158A-158D are placed proximate to Nth audio target 152N for capturing sounds that are provided by Nth audio target 152N.
  • Microphones 154A, 156A, and 158A capture sounds of respective audio targets 152A, 152B, and 152N that are associated with the instance of content that is captured by first camera 160A. Microphones 154B, 156B, and 158B capture sounds of respective audio targets 152A, 152B, and 152N that are associated with the instance of content that is captured by second camera 160B. Microphones 154C, 156C, and 158C capture sounds of respective audio targets 152A, 152B, and 152N that are associated with the instance of content that is captured by third camera 160C. Microphones 154D, 156D, and 158D capture sounds of respective audio targets 152A, 152B, and 152N that are associated with the instance of content that is captured by fourth camera 160D.
  • For example, audio that is captured by microphones 162A, 154A, 156A, and 158A may be combined to provide the instance of audio that is associated with the instance of content that is captured by first camera 160A. Audio that is captured by microphones 162B, 154B, 156B, and 158B may be combined to provide the instance of audio that is associated with the instance of content that is captured by second camera 160B, and so on.
  • Four cameras 160A-160D, four background microphones 162A-162D, four first target microphones 154A-154D, four second target microphones 156A-156D, and four Nth target microphones 158A-158D are shown in FIG. 1B for illustrative purposes and are not intended to be limiting. It will be recognized that the configuration of FIG. 1B may include any suitable number of cameras, background microphones, and/or target microphones.
• A sensory (auditory and/or visual) 3D experience can be captured as illustrated, or produced conceptually, based on the configuration of FIG. 1B. For instance, assume for purposes of illustration that processing circuitry 126 of FIG. 1A could receive only one audio set (e.g., a Dolby 5.1 audio set, a Dolby 7.1 audio set, or any other suitable type of audio set) along with a 3D8 data set. An audio set is a set of channels driving a corresponding set of speakers. If adaptable screen assembly 122 only supports 3D4, for example, processing circuitry 126 selectively delivers a set of four camera views out of the available eight to the screen. The selection of the four camera views is based on viewer reference information. For instance, perhaps cameras 1-4 are used to support a viewer (e.g., viewer 106) to the far left of adaptable screen assembly 122; as such viewer moves to the right, processing circuitry 126 may choose cameras 2-5, then cameras 3-6, then cameras 4-7, and finally cameras 5-8 when the viewer has reached the far right of adaptable screen assembly 122. While the viewer is within a certain camera set "zone", an adaptive light manipulator portion of adaptable screen assembly 122 will operate to enhance the 3D visual experience. It will be recognized that this can also be done with only a single 3D4 video stream and a 3D4 screen, without substituting cameras, via the light manipulator functionality alone.
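• The camera-window selection just described can be sketched minimally as follows (in Python; the normalized position input and the 1-based camera indexing are assumptions of the sketch):

```python
def select_camera_window(viewer_pos, total_views=8, screen_views=4):
    """Choose a contiguous window of camera views for the screen.

    viewer_pos: normalized horizontal viewer position derived from the
        reference information, 0.0 at the far left of the screen assembly
        and 1.0 at the far right.
    Returns 1-based camera indices: [1, 2, 3, 4] at the far left, stepping
    through [2, 3, 4, 5], ..., to [5, 6, 7, 8] at the far right.
    """
    zones = total_views - screen_views + 1          # five zones for 8-into-4
    zone = min(int(viewer_pos * zones), zones - 1)  # clamp viewer_pos == 1.0
    return list(range(zone + 1, zone + 1 + screen_views))
```

• For example, select_camera_window(0.5) yields cameras [3, 4, 5, 6] for a centrally positioned viewer; within each zone, the light manipulator alone refines the 3D visual experience as noted above.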
  • With respect to the audio, the received single audio set can be manipulated by processing circuitry 126 to create a more realistic 3D experience in synchrony with such changing video. In order to provide a more realistic 3D listening experience, multiple portions of further audio data can be made available to processing circuitry 126 from which a set can be selected or produced and balanced based on the viewer's reference information.
• Such multiple portions of further audio data may be captured and/or generated in any of a variety of ways. In a first example, a plurality of audio sets can be captured or constructed to service a selected set of viewer locations. With access to the plurality of audio sets, processing circuitry 126 can migrate between both the camera selections and the plurality of audio sets as the viewer moves from left to right as mentioned above, providing the viewer with a more accurate real-world sensory experience. Transitions between audio sets might be smoothed via a weighted combination of the channels of two adjacent audio sets.
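• A minimal sketch of such smoothing, assuming (for illustration only) that each audio set is represented as a list of per-channel sample buffers and that a zone-crossing weight has already been derived from the viewer's reference information:

```python
def crossfade_audio_sets(set_a, set_b, weight_b):
    """Blend two adjacent audio sets channel-by-channel.

    set_a, set_b: lists of per-channel sample buffers with matching channel
        counts and lengths (e.g., the six channels of two 5.1 audio sets).
    weight_b: 0.0 while the viewer is within set_a's zone, rising smoothly
        to 1.0 as the viewer crosses into set_b's zone.
    """
    weight_a = 1.0 - weight_b
    return [
        [weight_a * a + weight_b * b for a, b in zip(chan_a, chan_b)]
        for chan_a, chan_b in zip(set_a, set_b)
    ]
```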
• In a second example, multiple pieces of audio can be captured or produced which correspond to the significant sound origins (e.g., audio targets 152A, 152B, and 152N). In accordance with this example, each of the target microphones 154A-154D, 156A-156D, and 158A-158D may be a stereo microphone, and each may produce two channels. It is noted that rather than such channels being captured, the channels can be produced without physical microphones. For instance, the channels may be produced via software using the same reference points as the illustrated microphones that surround each point of origin. With the plurality of audio channels from target microphones 154A-154D, 156A-156D, and 158A-158D and background microphones 162A-162D, a plurality of audio sets can be produced and delivered downstream to processing circuitry 126 for selection therefrom to support a current viewer point of reference. Less significant sounds such as background music or background noise can be captured or produced as well via the background microphones 162A-162D, which also may be stereo microphones that each produce two channels.
• In a third example, instead of generating the audio sets before delivery to processing circuitry 126 as mentioned in the second example above, processing circuitry 126 may be provided with all of the microphone channels related to the background microphones 162A-162D and the audio target microphones 154A-154D, 156A-156D, and 158A-158D. With such access, processing circuitry 126 can generate the audio sets itself, perhaps providing a more realistic 3D sensory environment.
  • It will be recognized that although processing circuitry 126 may receive all of the sets or audio data from all of the microphones 162A-162D, 154A-154D, 156A-156D, and 158A-158D, processing circuitry 126 may still “balance” the audio set output to conform to the viewer's position with reference to the actual viewing/listening room layout.
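• The following sketch illustrates one way such balancing might be performed, assuming (for illustration) a 2D room coordinate frame, known speaker positions, and a simple inverse-distance level model; none of these assumptions is mandated by the embodiments:

```python
import math

SPEED_OF_SOUND = 343.0  # meters per second


def balance_to_room(speakers, viewer_pos):
    """Compute per-speaker gain and delay so the audio set output conforms
    to the viewer's position within the actual viewing/listening room.

    speakers: mapping of channel name to (x, y) speaker position in meters.
    viewer_pos: (x, y) estimated viewer position from the reference info.
    """
    vx, vy = viewer_pos
    dists = {name: math.hypot(sx - vx, sy - vy)
             for name, (sx, sy) in speakers.items()}
    farthest = max(dists.values())
    return {
        name: {
            "gain": d / farthest,                        # offset 1/r spreading
            "delay_s": (farthest - d) / SPEED_OF_SOUND,  # align arrivals
        }
        for name, d in dists.items()
    }
```

• For a viewer seated off-center, the nearer speakers are attenuated and delayed so that every channel reaches the viewer with matched level and aligned arrival time.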
• FIG. 2 shows an audio system 200, which is an exemplary viewer-located implementation of audio system 128 shown in FIG. 1A, in accordance with an embodiment. As shown in FIG. 2, audio system 200 includes a support element 202, a right ear speaker assembly 204 a, and a left ear speaker assembly 204 b. Support element 202 is configured to be placed on or around a viewer's head. Right ear speaker assembly 204 a is configured to be placed proximate the viewer's right ear. Left ear speaker assembly 204 b is configured to be placed proximate the viewer's left ear.
• The right and left ear speaker assemblies 204 a and 204 b may include respective portions of a plurality of speakers. For example, left ear speaker assembly 204 b is shown to include speakers 130 d, 130 b, and 130 g of audio system 128 in FIG. 1A for illustrative purposes. Speakers 130 d, 130 b, and 130 g provide respective portions 132 d, 132 b, and 132 g of audio content to the viewer's left ear. Speaker 130 d is configured to be positioned toward a back edge of the viewer's left ear to cause the viewer to perceive audio content 132 d as originating from the left rear of the viewer. Speaker 130 b is configured to be positioned toward a side of the viewer's left ear to cause the viewer to perceive audio content 132 b as originating from the left side of the viewer. Speaker 130 g is configured to be positioned toward a front edge of the viewer's left ear to cause the viewer to perceive audio content 132 g as originating from the left front of the viewer.
• Left ear speaker assembly 204 b further includes speakers 230 f and 230 h. Speaker 230 f provides portion 132 f of the audio content to the viewer's left ear. Speaker 230 f is configured to be positioned toward the front edge of the viewer's left ear. A corresponding speaker is included in speaker assembly 204 a and configured to be positioned toward a front edge of the viewer's right ear to provide portion 132 f of the audio content to the viewer's right ear. The provision of portion 132 f of the audio content is intended to cause the viewer to perceive the portion 132 f as originating in front of the viewer. Speaker 230 h is intended to serve as a subwoofer that provides a relatively low-frequency portion of the audio content to the viewer's left ear. The placement of speaker 230 h in speaker assembly 204 b may not substantially affect the viewer's perception of the originating location of the low-frequency portion of the audio content.
• Right ear speaker assembly 204 a includes speakers that are complementary to those that are included in left ear speaker assembly 204 b, though not shown in FIG. 2. For instance, the right ear speaker assembly 204 a may include speakers that are configured similarly to speakers 130 d, 130 b, and 130 g in left ear speaker assembly 204 b to provide respective audio portions 132 c, 132 a, and 132 e of the audio content. Right ear speaker assembly 204 a may also include a speaker corresponding to speaker 230 h in left ear speaker assembly 204 b to provide the low-frequency portion of the audio content to the viewer's right ear.
• Audio system 200 further includes a receiver 206, a battery 208, a transmitter 210, a right ear orientation element 210 a, a left ear orientation element 210 b, and circuitry 212. Right ear orientation element 210 a corresponds to (e.g., is attached to or is incorporated in) right ear speaker assembly 204 a, and left ear orientation element 210 b corresponds to (e.g., is attached to or is incorporated in) left ear speaker assembly 204 b. Right ear orientation element 210 a is configured to at least assist in determining a position of the viewer's right ear. Left ear orientation element 210 b is configured to at least assist in determining a position of the viewer's left ear. Right and left ear orientation elements 210 a and 210 b operate in conjunction to at least assist in detection of an orientation of the viewer's ears or head. For example, right and left ear orientation elements 210 a and 210 b may be included in reference information generation circuitry 110 a of FIG. 1A. Techniques that may utilize right and left ear orientation elements 210 a and 210 b for detecting an orientation of the viewer's ears or head are discussed below with respect to FIGS. 7 and 8.
  • Transmitter 210 is configured to transmit information regarding the orientation of the viewer, such as information regarding the location of the viewer's left ear, information regarding the location of the viewer's right ear, and/or information regarding the orientation of the viewer's ears or head, for further processing (e.g., by reference information generation circuitry 110 b and/or processing circuitry 126 of FIG. 1).
• Circuitry 212 includes a portion of driver circuitry 124 shown in FIG. 1A for controlling the speakers that are included in right and left ear speaker assemblies 204 a and 204 b. Circuitry 212 controls the speakers based on control signals that are received from processing circuitry 126 via receiver 206. For example, processing circuitry 126 may include a transmitter (not shown in FIG. 1A) for transmitting the control signals to audio system 200. Processing circuitry 126 may transmit the control signals via a wired or wireless communication pathway.
• In an embodiment, circuitry 212 modifies the control signals that are received from processing circuitry 126 to take into consideration the orientation of the viewer. In accordance with this embodiment, circuitry 212 controls the speakers based on the modified control signals. In another embodiment, transmitter 210 transmits information regarding the orientation of the viewer to reference information generation circuitry 110 b. Reference information generation circuitry 110 b provides the information regarding the orientation of the viewer to processing circuitry 126, which modifies control signals based on the orientation of the viewer and transmits those modified control signals to audio system 200. In accordance with this embodiment, circuitry 212 controls the speakers based on the modified control signals, which are received from processing circuitry 126 via receiver 206. Battery 208 provides power to the various speakers.
• Audio system 200 is shown in FIG. 2 to be implemented as headphones for illustrative purposes and is not intended to be limiting. Audio system 200 may be implemented using any suitable viewer-located or viewer-remote system, or a combination thereof. For example, audio system 200 may be implemented at least in part as eyewear that has speaker assemblies 204 a and 204 b attached thereto.
• Various embodiments of media system 102 of FIG. 1A will now be described in reference to FIGS. 3-8. Each of these embodiments utilizes different implementations of reference information generation circuitry 110 a and 110 b to produce reference information for provision to processing circuitry 126. These different implementations are described herein by way of example only and are not intended to be limiting. In each of FIGS. 3-8, portions 132 a-132 g of audio content are referred to collectively as audio content 132.
  • For example, FIG. 3 is a block diagram of a first embodiment of media system 102 in which reference information generation circuitry 110 a and 110 b jointly implement a triangulation technique to determine an estimated location of viewer 106 relative to adaptable screen assembly 122. As shown in FIG. 3, in accordance with this embodiment, reference information generation circuitry 110 a includes a transmitter 306 that is operable to transmit a location tracking signal 308. Location tracking signal 308 may comprise, for example, a radio frequency (RF) signal or other wireless signal. In further accordance with the embodiment shown in FIG. 3, reference information generation circuitry 110 b includes a plurality of receivers 302 1-302 N and triangulation circuitry 304 connected thereto. Receivers 302 1-302 N are operable to receive corresponding versions 310 1-310 N of location tracking signal 308. Triangulation circuitry 304 is operable to determine an estimated location of viewer 106 based on characteristics of the received versions 310 1-310 N of location tracking signal 308. For example, triangulation circuitry 304 may determine the estimated location of viewer 106 by measuring relative time delays between the received versions 310 1-310 N of location tracking signal 308, although this is only an example. The estimated location of viewer 106 is then provided by triangulation circuitry 304 to processing circuitry 126 as part of the above-described reference information.
  • Transmitter 306 is operable to transmit location tracking signal 308 on an on-going basis. For example, transmitter 306 may be configured to automatically transmit location tracking signal 308 on a periodic or continuous basis. Alternatively, transmitter 306 may intermittently transmit location tracking signal 308 responsive to certain activities of viewer 106 or other events. Triangulation circuitry 304 is operable to calculate an updated estimate of the location of viewer 106 based on the corresponding versions 310 1-310 N of location tracking signal 308 received over time. Since reference information generation circuitry 110 a comprises viewer-located circuitry, as viewer 106 moves around the viewing area in front of adaptable screen assembly 122, triangulation circuitry 304 will be able to produce updated estimates of the location of viewer 106 and provide such updated estimates to processing circuitry 126. Processing circuitry 126 will then cause modification of at least one of the one or more adaptable display characteristics of adaptable screen assembly 122 so that three-dimensional content will be displayed in a manner that is suitable or optimized for viewing at the current estimated location of viewer 106. In addition or alternatively, processing circuitry 126 will cause modification of a spatial orientation of audio content 132 to indicate the current estimated location of viewer 106.
  • As will be understood by persons skilled in the relevant art(s), to perform the triangulation function accurately, certain positioning of and/or spacing between receivers 302 1-302 N may be required. Depending upon the implementation, each of the receivers 302 1-302 N may be included at fixed spatially-dispersed locations within a single housing, and the housing may be placed in a particular location to achieve satisfactory or optimal results. Alternatively, separate housings may be used to contain different ones of receivers 302 1-302 N and may be placed at different locations in or around the viewing area to achieve satisfactory or optimal results. For instance, one or more of the receivers 302 1-302 N may be included in (or attached to) one or more speaker assemblies that are included in audio system 128.
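• For illustration, the sketch below estimates a transmitter location from relative time delays in the manner described above, using a brute-force search over candidate positions; the shared receiver clock, 2D coordinates, and search granularity are assumptions of the sketch rather than features of the embodiment:

```python
import math

SPEED_OF_LIGHT = 299_792_458.0  # meters per second


def tdoa_locate(receivers, arrival_times, extent=5.0, step=0.05):
    """Estimate a transmitter location from relative time delays measured
    across several receivers at known positions.

    receivers: list of (x, y) receiver positions in meters.
    arrival_times: arrival time of the tracking signal at each receiver on
        a shared clock; only the pairwise differences are used.
    A grid search over the viewing area finds the point whose predicted
    time-difference-of-arrival pattern best matches the measurements.
    """
    measured = [t - arrival_times[0] for t in arrival_times]
    best, best_err = None, float("inf")
    steps = int(2 * extent / step)
    for i in range(steps + 1):
        for j in range(steps + 1):
            x, y = -extent + i * step, -extent + j * step
            d = [math.hypot(x - rx, y - ry) for rx, ry in receivers]
            predicted = [(di - d[0]) / SPEED_OF_LIGHT for di in d]
            err = sum((m - p) ** 2 for m, p in zip(measured, predicted))
            if err < best_err:
                best, best_err = (x, y), err
    return best
```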
• FIG. 4 is a block diagram of a second embodiment of media system 102 in which reference information generation circuitry 110 a and 110 b jointly implement a triangulation technique to determine an estimated location of viewer 106 relative to adaptable screen assembly 122. As shown in FIG. 4, in accordance with this embodiment, reference information generation circuitry 110 b includes a plurality of transmitters 402 1-402 N that are operable to transmit corresponding location tracking signals 412 1-412 N. Location tracking signals 412 1-412 N may comprise, for example, RF signals or other wireless signals. In further accordance with the embodiment shown in FIG. 4, reference information generation circuitry 110 a includes a plurality of receivers 406 1-406 N and triangulation circuitry 408 connected thereto. Receivers 406 1-406 N are operable to receive corresponding location tracking signals 412 1-412 N. Triangulation circuitry 408 is operable to determine an estimated location of viewer 106 based on characteristics of the received location tracking signals 412 1-412 N. For example, triangulation circuitry 408 may determine the estimated location of viewer 106 by determining a distance to each of transmitters 402 1-402 N based on the location tracking signals received therefrom, although this is only an example. The estimated location of viewer 106 is then provided by triangulation circuitry 408 to reference information generation circuitry 110 b via a wired or wireless communication channel established between a transmitter 410 of reference information generation circuitry 110 a and a receiver 404 of reference information generation circuitry 110 b. Reference information generation circuitry 110 b then provides the estimated location of viewer 106 to processing circuitry 126 as part of the above-described reference information.
  • Transmitters 402 1-402 N are operable to transmit location tracking signals 412 1-412 N on an on-going basis. For example, transmitters 402 1-402 N may be configured to automatically transmit location tracking signals 412 1-412 N on a periodic or continuous basis. Alternatively, transmitters 402 1-402 N may intermittently transmit location tracking signals 412 1-412 N responsive to certain activities of viewer 106 or other events. Triangulation circuitry 408 is operable to calculate an updated estimate of the location of viewer 106 based on the versions of location tracking signals 412 1-412 N received over time. Since reference information generation circuitry 110 a comprises viewer-located circuitry, as viewer 106 moves around the viewing area in front of adaptable screen assembly 122, triangulation circuitry 408 will be able to produce updated estimates of the location of viewer 106 and provide such updated estimates to reference information generation circuitry 110 b for forwarding to processing circuitry 126. Processing circuitry 126 will then cause modification of at least one of the one or more adaptable display characteristics of adaptable screen assembly 122 so that three-dimensional content will be displayed in a manner that is suitable or optimized for viewing at the current estimated location of viewer 106. In addition or alternatively, processing circuitry 126 will cause modification of a spatial orientation of audio content 132 to indicate the current estimated location of viewer 106.
  • As will be understood by persons skilled in the relevant art(s), to perform the triangulation function accurately, certain positioning of and/or spacing between transmitters 402 1-402 N may be required. Depending upon the implementation, each of the transmitters 402 1-402 N may be included at fixed locations within a single housing, and the housing may be placed in a particular location to achieve satisfactory or optimal results. Alternatively, separate housings may be used to contain different ones of transmitters 402 1-402 N and may be placed at different locations in or around the viewing area to achieve satisfactory or optimal results. For instance, one or more of the transmitters 402 1-402 N may be included in (or attached to) one or more speaker assemblies that are included in audio system 128.
• FIG. 5 is a block diagram of a further embodiment of media system 102 in which reference information generation circuitry 110 a and 110 b jointly implement an infrared (IR) distance measurement system to help determine an estimated location of viewer 106 relative to adaptable screen assembly 122. As shown in FIG. 5, in accordance with this embodiment, reference information generation circuitry 110 b includes one or more IR light sources 502 and reference information generation circuitry 110 a includes one or more IR sensors 506. IR sensor(s) 506 are configured to sense IR light 508 emitted by IR light source(s) 502 and to analyze characteristics associated with such light to help generate information concerning an estimated location of viewer 106 with respect to adaptable screen assembly 122. The estimated location of viewer 106 may then be provided by reference information generation circuitry 110 a to reference information generation circuitry 110 b via a wired or wireless communication channel established between a transmitter 510 of reference information generation circuitry 110 a and a receiver 504 of reference information generation circuitry 110 b. Reference information generation circuitry 110 b then provides the estimated location of viewer 106 to processing circuitry 126 as part of the above-described reference information. Processing circuitry 126 will then cause modification of at least one of the one or more adaptable display characteristics of adaptable screen assembly 122 so that three-dimensional content will be displayed in a manner that is suitable or optimized for viewing at the current estimated location of viewer 106. In addition or alternatively, processing circuitry 126 will cause modification of a spatial orientation of audio content 132 to indicate the current estimated location of viewer 106.
  • In alternate implementations, the IR distance measurement system may be implemented by incorporating one or more IR light sources into reference information generation circuitry 110 a and incorporating one or more IR sensors into reference information generation circuitry 110 b. In a still further implementation, reference information generation circuitry 110 b includes one or more IR light sources for projecting IR light toward the viewing area and one or more IR sensors for sensing IR light reflected from objects in the viewing area. Characteristics of the IR light reflected from the objects in the viewing area may then be analyzed to help estimate a current location of viewer 106. A like system could also be implemented by reference information generation circuitry 110 a, except that the IR light would be projected out from the viewer's location instead of toward the viewing area. Still other IR distance measurement systems may be used to generate the aforementioned reference information.
• FIG. 6 is a block diagram of a further embodiment of media system 102 in which reference information generation circuitry 110 a and 110 b jointly implement a magnetic field detection system to help determine an estimated location of viewer 106 relative to adaptable screen assembly 122. As shown in FIG. 6, in accordance with this embodiment, reference information generation circuitry 110 a includes one or more magnetic field sources 604 and reference information generation circuitry 110 b includes one or more magnetic field sensors 602. Magnetic field sensor(s) 602 are configured to sense magnetic field(s) generated by magnetic field source(s) 604 and to analyze characteristics associated therewith to help generate information concerning an estimated location of viewer 106 with respect to adaptable screen assembly 122. The estimated location of viewer 106 may then be provided by reference information generation circuitry 110 b to processing circuitry 126 as part of the above-described reference information. Processing circuitry 126 will then cause modification of at least one of the one or more adaptable display characteristics of adaptable screen assembly 122 so that three-dimensional content will be displayed in a manner that is suitable or optimized for viewing at the current estimated location of viewer 106. In addition or alternatively, processing circuitry 126 will cause modification of a spatial orientation of audio content 132 to indicate the current estimated location of viewer 106.
  • In alternate implementations, the magnetic field detection system may be implemented by incorporating one or more magnetic field sources into reference information generation circuitry 110 b and incorporating one or more magnetic field sensors into reference information generation circuitry 110 a. In a still further implementation, reference information generation circuitry 110 b includes one or more magnetic field sources for projecting magnetic fields toward the viewing area and one or more magnetic field sensors for sensing magnetic fields reflected from objects in the viewing area. Characteristics of the magnetic fields reflected from the objects in the viewing area may then be analyzed to help estimate a current location of viewer 106. A like system could also be implemented by reference information generation circuitry 110 a, except that the magnetic fields would be projected out from the viewer's location instead of toward the viewing area. Still other magnetic field detection systems may be used to generate the aforementioned reference information.
  • FIG. 7 is a block diagram of a further embodiment of media system 102 in which reference information generation circuitry 110 a includes one or more cameras and one or more microphones for facilitating the generation of the aforementioned reference information. In particular, as shown in FIG. 7, reference information generation circuitry 110 a includes one or more cameras 708, one or more microphones 710, and a transmitter 712.
• Camera(s) 708 operate to capture images of the viewing environment of viewer 106 and are preferably carried or mounted on viewer 106 in such a manner as to capture images that correspond to a field of vision of viewer 106. These images are then transmitted by transmitter 712 to a receiver 702 in reference information generation circuitry 110 b via a wired or wireless communication channel. Such images are then processed by image processing circuitry 704 within reference information generation circuitry 110 b. Image processing circuitry 704 may process such images to determine a current estimated location and/or head orientation of viewer 106.
  • For example, image processing circuitry 704 may compare such images to one or more reference images in order to determine a current estimated location and/or head orientation of viewer 106. The reference images may comprise, for example, images of adaptable screen assembly 122, speakers that are included in audio system 128, and/or other objects or points of interest normally viewable by a viewer of media system 102 captured from one or more locations and at one or more orientations within the viewing area.
  • As another example, image processing circuitry 704 may calculate measurements associated with representations of objects or points of interest captured in such images and then compare those measurements to known measurements associated with the objects or points of interest to determine a current estimated location and/or head orientation of viewer 106. Still other techniques may be used to process such images to determine an estimated current location and/or head orientation of viewer 106.
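• As an illustrative sketch of the measurement-comparison approach just described, the following assumes a pinhole camera model, a known physical screen width, and a camera focal length expressed in pixels (all assumptions of the sketch, not requirements of the embodiment):

```python
import math


def distance_from_screen_image(real_width_m, pixel_width, focal_length_px):
    """Estimate viewer-to-screen distance from one image captured by a
    viewer-mounted camera, using the pinhole camera model.

    real_width_m: known physical width of the screen (the known measurement).
    pixel_width: measured width of the screen in the captured image, pixels.
    focal_length_px: camera focal length expressed in pixels.
    """
    return focal_length_px * real_width_m / pixel_width


def bearing_to_screen(center_offset_px, focal_length_px):
    """Estimate head yaw (radians) from how far the screen's center sits
    from the image center; 0.0 means the viewer faces the screen directly."""
    return math.atan2(center_offset_px, focal_length_px)
```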
  • Image processing circuitry 704 then provides the estimated location and/or head orientation of viewer 106 to processing circuitry 126 as part of the above-described reference information. Processing circuitry 126 will then cause modification of at least one of the one or more adaptable display characteristics of adaptable screen assembly 122 so that three-dimensional content will be displayed in a manner that is suitable or optimized for viewing at the current estimated location and/or in accordance with the current estimated head orientation of viewer 106. In addition or alternatively, processing circuitry 126 will cause modification of a spatial orientation of audio content 132 to indicate the current estimated location and/or current estimated head orientation of viewer 106.
• It is noted that the images captured by camera(s) 708 and/or processed by image processing circuitry 704 need not comprise images of the type intended for viewing by human eyes. Rather, such images may be of a resolution or frequency range beyond the capability of the rods and cones of the human eye.
  • In a further embodiment, images of adaptable screen assembly 122 captured by camera(s) 708 are processed by image processing circuitry 704 to determine or measure one or more qualities relating to how adaptable screen assembly 122 is currently presenting two-dimensional or three-dimensional content to viewer 106. Such qualities may include but are not limited to image sharpness, brightness, contrast, resolution, and colors. Image processing circuitry 704 provides information concerning the determined or measured qualities to processing circuitry 126. If processing circuitry 126 determines that a particular quality of the presentation is not acceptable, processing circuitry 126 can implement changes to one or more of the adaptable display characteristics of adaptable screen assembly 122 to adjust that particular quality until it is deemed acceptable. In this manner, media system 102 can implement an image-based feedback mechanism for improving the quality of presentation of two-dimensional and three-dimensional content to a viewer.
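• A minimal sketch of one iteration of such a feedback mechanism, assuming (for illustration) a single normalized quality metric such as brightness and a simple proportional adjustment:

```python
def quality_feedback_step(measured, target, current_setting, gain=0.5):
    """One iteration of an image-based feedback loop: nudge an adaptable
    display setting (e.g., brightness) toward a target value measured from
    images captured by the viewer-located camera(s).

    measured, target, current_setting: normalized to a 0..1 scale.
    gain: proportional step size; smaller values converge more slowly but
        avoid visible oscillation between frames.
    """
    error = target - measured
    return min(max(current_setting + gain * error, 0.0), 1.0)
```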
  • Microphone(s) 710 included within reference information generation circuitry 110 a operate to capture one or more audio signal(s) which are transmitted by transmitter 712 to receiver 702 in reference information generation circuitry 110 b. Such audio signal(s) are then processed by audio processing circuitry 706 within reference information generation circuitry 110 b. Audio processing circuitry 706 may process such audio signal(s) to determine a current estimated location and/or head orientation of viewer 106. For example, audio processing circuitry 706 may process such audio signal(s) to determine a direction of arrival associated with one or more speakers of audio system 128 that are located in or around the viewing environment. Such directions of arrival may then be utilized to estimate a current location and/or head orientation of viewer 106. Still other techniques may be used to process such audio signal(s) to determine an estimated current location and/or head orientation of viewer 106. Audio processing circuitry 706 then provides the estimated location and/or head orientation of viewer 106 to processing circuitry 126 as part of the above-described reference information. Processing circuitry 126 will then cause modification of at least one of the one or more adaptable display characteristics of adaptable screen assembly 122 so that three-dimensional content will be displayed in a manner that is suitable or optimized for viewing at the current estimated location and/or in accordance with the current estimated head orientation of viewer 106. In addition or alternatively, processing circuitry 126 will cause modification of a spatial orientation of audio content 132 to indicate the current estimated location and/or current estimated head orientation of viewer 106.
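• For illustration, the sketch below estimates a direction of arrival from the time delay between two head-mounted microphones under a far-field approximation; the microphone spacing and the use of exactly two microphones are assumptions of the sketch:

```python
import math

SPEED_OF_SOUND = 343.0  # meters per second


def doa_azimuth(delay_s, mic_spacing_m=0.18):
    """Estimate the direction of arrival of a speaker's sound from the
    arrival-time difference between two head-mounted microphones.

    delay_s: arrival time at the left microphone minus arrival time at the
        right microphone, in seconds.
    mic_spacing_m: distance between the microphones (roughly head width).
    Returns azimuth in degrees, 0.0 for a source straight ahead. Azimuths
    to two or more speakers at known positions constrain the viewer's
    location and head orientation.
    """
    # Far-field approximation: path difference = spacing * sin(azimuth).
    sin_az = max(-1.0, min(1.0, SPEED_OF_SOUND * delay_s / mic_spacing_m))
    return math.degrees(math.asin(sin_az))
```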
  • In a further embodiment, audio signal(s) captured by microphone(s) 710 are processed by audio processing circuitry 706 to determine or measure one or more qualities relating to how audio system 128 is currently presenting audio content 132 to viewer 106. Such qualities may include but are not limited to loudness, balance, surround-sound, delay, and audio spatial orientation performance. Audio processing circuitry 706 provides information concerning the determined or measured qualities to processing circuitry 126. If processing circuitry 126 determines that a particular quality of the presentation is not acceptable, processing circuitry 126 can implement changes to one or more settings or characteristics of audio system 128 to adjust that particular quality until it is deemed acceptable. In this manner, media system 102 can implement an audio-based feedback mechanism for improving the quality of presentation of audio content 132 to a viewer.
• In a still further embodiment, microphone(s) 710 may be used to allow viewer 106 to deliver voice commands for controlling certain aspects of media system 102, including the manner in which two-dimensional and three-dimensional content is presented via adaptable screen assembly 122. In accordance with such an embodiment, audio processing circuitry 706 may comprise circuitry for recognizing and extracting such voice commands from the audio signal(s) captured by microphone(s) 710 and passing the commands to processing circuitry 126. In response to receiving such commands, processing circuitry 126 may cause a spatial orientation of audio content 132 and/or at least one of the one or more adaptable display characteristics of adaptable screen assembly 122 relating to the presentation of two-dimensional or three-dimensional content to be modified. Such voice commands may be used for other purposes as well, including controlling what audio content is provided to viewer 106 via audio system 128 and/or what visual content is delivered to viewer 106 via adaptable screen assembly 122.
  • In the embodiment of media system 102 shown in FIG. 7, image processing circuitry 704 and audio processing circuitry 706 are shown as part of reference information generation circuitry 110 b. It is noted that in alternate embodiments, such circuitry may instead be included within reference information generation circuitry 110 a. In accordance with still further embodiments, image processing circuitry and/or audio processing circuitry may be distributed among reference information generation circuitry 110 a and 110 b.
• FIG. 8 is a block diagram of a further embodiment of media system 102 in which reference information generation circuitry 110 a includes a head orientation sensor 808 and eye tracking circuitry 806 for determining a head orientation and point of gaze, respectively, of viewer 106. Head orientation sensor(s) 808 may include, for example and without limitation, an accelerometer or other device designed to detect motion in three dimensions or tilting in a two-dimensional reference plane. Eye tracking circuitry 806 may comprise any system or device suitable for tracking the motion of the eyes of a viewer to determine a point of gaze therefrom. The determined head orientation and point of gaze of viewer 106 are transmitted by a transmitter 804 included in reference information generation circuitry 110 a to a receiver 802 included in reference information generation circuitry 110 b via a wired or wireless communication channel. The determined head orientation and point of gaze of viewer 106 may then be provided by reference information generation circuitry 110 b to processing circuitry 126 as part of the above-described reference information. Processing circuitry 126 will then cause modification of at least one of the one or more adaptable display characteristics of adaptable screen assembly 122 so that three-dimensional content will be displayed in a manner that is suitable or optimized for viewing by viewer 106 in light of the determined head orientation and/or point of gaze of viewer 106. In addition or alternatively, processing circuitry 126 will cause modification of a spatial orientation of audio content 132 to indicate the current estimated head orientation and/or current estimated point of gaze of viewer 106.
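• As an illustrative sketch, head pitch and roll can be derived from an accelerometer's gravity reading as follows; the axis conventions are assumptions of the sketch, and yaw is not observable from gravity alone (it would require a magnetometer or the triangulation techniques described above):

```python
import math


def tilt_from_accelerometer(ax, ay, az):
    """Derive head pitch and roll in degrees from the gravity vector
    reported by a head orientation sensor at rest.

    ax, ay, az: accelerometer readings in units of g, with x pointing
        forward, y to the viewer's left, and z up (an assumed convention).
    """
    pitch = math.degrees(math.atan2(ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll
```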
  • FIG. 9 is a block diagram of a further embodiment of media system 102 in which reference information is generated entirely by non-viewer-located reference information generation circuitry 110. As shown in FIG. 9, reference information generation circuitry 110 includes one or more camera(s) 902 and one or more microphone(s) 904.
• Camera(s) 902 operate to capture images of a viewing area in front of adaptable screen assembly 122. The images may be captured using ambient light or, alternatively, reference information generation circuitry 110 may include one or more light sources (e.g., IR light sources or other types of light sources) that operate to radiate light into the viewing area so that camera(s) 902 may capture light reflected from people and objects in the viewing area. The images captured by camera(s) 902 are processed by image/audio processing circuitry 906 to determine an estimated location of viewer 106. Similarly, microphone(s) 904 operate to capture audio signals that include audio content 132 from speakers that are included in audio system 128. For instance, such speakers may be located in and around the viewing area in front of adaptable screen assembly 122. The audio signals captured by microphone(s) 904 are also processed by image/audio processing circuitry 906 to determine an estimated location of viewer 106. Image/audio processing circuitry 906 then provides the estimated location of viewer 106 to processing circuitry 126 as part of the above-described reference information. Processing circuitry 126 will then cause modification of at least one of the one or more adaptable display characteristics of adaptable screen assembly 122 so that three-dimensional content will be displayed in a manner that is suitable or optimized for viewing at the current estimated location of viewer 106. In addition or alternatively, processing circuitry 126 will cause modification of a spatial orientation of audio content 132 to indicate the current estimated location of viewer 106.
• The audio signal(s) captured by microphone(s) 904 may also be processed to detect and extract voice commands uttered by viewer 106, such voice commands being executed by processing circuitry 126 to facilitate viewer control over a spatial orientation of audio content 132 and/or at least one of the one or more adaptable display characteristics of adaptable screen assembly 122 relating to the presentation of two-dimensional or three-dimensional content. As noted above with respect to the embodiment shown in FIG. 7, such voice commands may be used for other purposes as well, including controlling what audio content is provided to viewer 106 via audio system 128 and/or what visual content is delivered to viewer 106 via adaptable screen assembly 122.
• The various embodiments of reference information generation circuitry described above in regard to FIGS. 3-9 have been provided herein by way of example only and are not intended to be limiting. Persons skilled in the relevant art(s) will appreciate that other types of reference information generation circuitry may be used to produce reference information corresponding to at least one positional characteristic of a viewing reference of a viewer. For example, ultra wide band (UWB) is yet another transmission approach that may be used to support location determination. In accordance with this example, reference information generation circuitry may include UWB circuitry (e.g., UWB receivers and/or UWB transmitters). It is further noted that certain features of the reference information generation circuitry described in regard to FIGS. 3-9 may be combined to produce additional embodiments. For example, an embodiment may utilize a combination of triangulation, IR distancing, head orientation sensing and eye tracking to generate extremely precise reference information concerning a viewing reference of a viewer.
  • FIG. 10 depicts an exemplary headset 1000 that may implement various features of reference information generation circuitry 110 a as described above. As shown in FIG. 10, headset 1000 includes a frame 1002 that supports a right lens 1004A and a left lens 1004B. Right lens 1004A and left lens 1004B may comprise, for example, colored, polarizing, or shuttering lenses that enable a viewer to perceive certain types of three-dimensional content delivered by certain types of screen assemblies. Thus, lenses 1004A and 1004B can be used to control how video content that is rendered to a screen assembly is perceived by a viewer. In one embodiment, frame 1002 is mounted in such a manner that it can be flipped up and away from the eyes of a viewer wearing headset 1000 when the viewer does not desire or need to use such lenses.
  • As further shown in FIG. 10, headset 1000 also includes a right speaker assembly 1006A and a left speaker assembly 1006B, each of which may include one or more speakers. Such speakers can be used to deliver audio content to a viewer of a screen assembly, wherein the audio content is related to video content displayed on the screen assembly and viewable by the viewer. Such speakers can deliver other types of audio content, as well. In an embodiment, right speaker assembly 1006A and left speaker assembly 1006B include right ear speaker assembly 204 a and left ear speaker assembly 204 b, respectively, as described above with reference to FIG. 2.
  • Headset 1000 further includes a microphone 1002. Microphone 1002 may be used to support any of the functionality described above in reference to microphone(s) 710 of FIG. 7. In certain alternate implementations, headset 1000 may include additional microphones beyond microphone 1002.
  • Headset 1000 still further includes a battery compartment 1010 for housing one or more batteries. Such battery or batteries may or may not be rechargeable depending upon the implementation. In a further embodiment, headset 1000 includes an interface for connecting to an external power source. The connection to the external power source may be made to deliver power to headset 1000 as well as to recharge the battery or batteries stored in battery compartment 1010.
• Headset 1000 further includes a forward housing 1002, a right-side housing 1008 and a left-side housing 1012. These housings may be used to store any number of the various types of viewer-located reference information generation circuitry described above in reference to FIGS. 3-8. For example, in one embodiment, forward housing 1002 houses one or more cameras that support any of the functionality described above in reference to camera(s) 708 of FIG. 7, and each side housing stores a transmitter or receiver used for implementing a triangulation system for determining a location of a viewer wearing headset 1000. Such housings may also be used to house circuitry used for implementing features relating to IR distancing, magnetic field detection, head orientation sensing, eye tracking, or the like, as described above.
  • Headset 1000 of FIG. 10 is merely one example of how reference information generation circuitry 110 a may be worn by or carried by a user. Any of a wide variety of portable or wearable articles or devices may be used. Furthermore, reference information generation circuitry 110 a may be distributed among multiple articles or devices worn or carried by a viewer.
  • For example, FIG. 11A depicts an implementation in which reference information generation circuitry 110 a is distributed among a headset 1102 and a remote control 1104, which are connected to each other by a wired communication link in the form of a cable 1106. In accordance with this example, remote control 1104 may be connected to other portions of a media system (e.g., reference information generation circuitry 110 b) via a wireless communication link and act as a conduit for communication between headset 1102 and such other portions. In further accordance with this example, headset 1102 may include speakers for delivering audio content to a viewer, as well as one or more cameras, a head orientation sensor, and an eye movement tracker which support functionality for determining a location, head orientation and point of gaze of a viewer, as described above. Information generated by these components can be delivered via remote control 1104 to other components of the media system for processing and/or can be processed by components within remote control 1104. Remote control 1104 may also include circuitry for supporting the production of reference information, such as for example, receivers or transmitters used to support triangulation-based viewer location, microphones, or the like.
  • FIG. 11B depicts a different implementation in which reference information generation circuitry 110 a is distributed among a headset 1112 and a laptop computer 1114, which are connected to each other by a wireless communication link 1116, such as for example a Bluetooth® connection. In accordance with this example, laptop computer 1114 may be connected to other portions of a media system (e.g., reference information generation circuitry 110 b) via a wired or wireless communication link and act as a conduit for communication between headset 1112 and such other portions. In further accordance with this example, headset 1112 may include speakers for delivering audio content to a viewer as well as a microphone for capturing voice commands from a viewer and/or other audio content. Headset 1112 may also include one or more cameras, a head orientation sensor, and an eye movement tracker that support functionality for determining a location, head orientation and point of gaze of a viewer as described above. Information generated by these components can be delivered via laptop computer 1114 to other components of the media system for processing and/or can be processed by components within laptop computer 1114. Laptop computer 1114 may also include circuitry for supporting the production of reference information, such as for example, receivers or transmitters used to support triangulation-based viewer location, additional cameras and microphones, or the like.
  • FIGS. 11A and 11B provide merely a few examples of how reference information generation circuitry 110 a may be distributed among multiple articles or devices worn or carried by a viewer. These examples are not intended to be limiting. Persons skilled in the relevant art(s) will appreciate that a large number of other variations may be used.
  • FIG. 12 depicts a flowchart 1200 of a method for presenting three-dimensional content to a viewer in accordance with an embodiment. The method of flowchart 1200 will be described herein with continued reference to media system 102 of FIG. 1. However, the method is not limited to that system.
  • As shown in FIG. 12, the method of flowchart 1200 begins at step 1202, in which first circuitry at least assists in producing reference information corresponding to at least one positional characteristic of an orientation of a viewer. The first circuitry may comprise, for example, reference information generation circuitry 110 a and/or reference information generation circuitry 110 b as described above. The orientation of the viewer may comprise any of a number of positional characteristics that affect how three-dimensional visual content displayed via an adaptable screen assembly and/or audio content that is associated therewith will be perceived by the viewer. As noted above, such positional characteristics may include, for example and without limitation, a position or location of the viewer relative to the adaptable screen assembly, a head orientation of the viewer and a point of gaze of the viewer. The reference information may be produced using any of the approaches previously described herein as well as additional approaches not described herein.
  • At step 1204, the reference information produced during step 1202 is provided to second circuitry.
  • At step 1206, the second circuitry issues one or more control signals to cause modification of at least one of one or more adaptable display characteristics of an adaptable screen assembly. The modification corresponds at least in part to the reference information. The second circuitry may comprise, for example, processing circuitry 126 as described above. Such processing circuitry 126 may cause the modification of the at least one of the one or more adaptable display characteristics by sending one or more suitable control signals to driver circuitry 124. The one or more adaptable display characteristics may include, but are not limited to, a configuration of an adaptable light manipulator that forms part of the adaptable screen assembly, a manner in which images are mapped to display pixels in a pixel array that forms part of the adaptable screen assembly, and/or a distance and angular alignment between such an adaptable light manipulator and such a pixel array.
  • At step 1208, the second circuitry issues one or more control signals to cause modification of a spatial orientation of audio content. The modification corresponds at least in part to the reference information. The second circuitry may comprise, for example, processing circuitry 126, which may cause the modification of the spatial orientation of the audio content by sending one or more suitable control signals to driver circuitry 124. For example, the modification of the spatial orientation may include modification of an amplitude of sound that corresponds to a specified sound source that corresponds to (e.g., is depicted in) a three-dimensional presentation that is supported by the adaptable screen assembly. In another example, the modification of the spatial orientation may include modification of a delay that is associated with such sound.
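• A minimal sketch of such amplitude and delay modification, assuming (for illustration) a 2D coordinate frame, an inverse-distance level model, and an apparent source position derived from the three-dimensional presentation:

```python
import math

SPEED_OF_SOUND = 343.0  # meters per second


def spatialize_source(source_pos, viewer_pos, ref_distance=1.0):
    """Derive an amplitude scale and a delay for a sound source depicted in
    the three-dimensional presentation, given the viewer's position.

    source_pos: (x, y) apparent position of the depicted source in meters
        (e.g., in front of or behind the screen plane).
    viewer_pos: (x, y) viewer position taken from the reference information.
    """
    dist = max(math.hypot(source_pos[0] - viewer_pos[0],
                          source_pos[1] - viewer_pos[1]), 1e-3)
    amplitude = ref_distance / dist   # inverse-distance level falloff
    delay_s = dist / SPEED_OF_SOUND   # propagation delay to the viewer
    return amplitude, delay_s
```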
  • FIG. 13 depicts a flowchart 1300 of a method for delivering video output and audio output to a viewer based at least in part on positional characteristic(s) relating to an orientation of the viewer in accordance with an embodiment. The method of flowchart 1300 will be described herein with continued reference to media system 102 of FIG. 1. However, the method is not limited to that system.
  • As shown in FIG. 13, the method of flowchart 1300 begins at step 1302, in which at least one positional characteristic relating to a first orientation of a viewer within a premises is identified. Such positional characteristics may include, for example and without limitation, a position or location of the viewer relative to one or more objects (e.g., an adaptable screen assembly, one or more speakers, etc.) in the premises, a head orientation of the viewer, and/or a point of gaze of the viewer. The position or location of the viewer relative to an object (i.e., the relative position or location of the viewer) may include a distance from the object or some reference point associated therewith, and such distance may include both horizontal distance and elevation. The position or location of the viewer may also include eye locations of the viewer. The head orientation of the viewer may include a degree of tilt and/or rotation of the head of the viewer. In an exemplary implementation, reference information generation circuitry 110 a and/or reference information generation circuitry 110 b identifies the at least one positional characteristic relating to a first orientation of viewer 106.
  • At step 1304, a video output, which is tailored based at least in part on the at least one positional characteristic, for a three-dimensional visual presentation is delivered to the viewer in the first orientation. In an exemplary implementation, adaptable screen assembly 122 delivers the video output for a three-dimensional visual presentation to viewer 106 in the first orientation.
  • At step 1306, audio output is tailored based at least in part on the at least one positional characteristic. In an example, tailoring the audio output may include selecting the audio output from a plurality of audio outputs. In accordance with this example, each of the plurality of audio outputs may correspond to a respective three-dimensional view, a respective designated (e.g., predetermined) location and/or respective designated (e.g., predetermined) head orientation of the viewer, etc. For instance, the audio output may be selected based on a signal that is generated in response to input from the viewer. In another example, tailoring the audio output may include generating the audio output. In yet another example, tailoring the audio output may include changing an amplitude of the audio output. In still another example, tailoring the audio output may include adding a delay to the audio output or removing a delay from the audio output. In an exemplary implementation, driver circuitry 124 tailors audio output corresponding to audio content 132 based at least in part on the at least one positional characteristic in accordance with control signals that are received from processing circuitry 126.
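• The first tailoring option above, selecting the audio output from a plurality of pre-rendered outputs, might look like the following sketch. The nearest-orientation selection rule and the dictionary keyed by designated head orientations are assumptions made for illustration.

```python
def select_audio_output(head_orientation_deg, outputs_by_orientation):
    """Pick the pre-rendered output whose designated head orientation is
    closest to the measured one (a hypothetical selection rule)."""
    designated, output = min(
        outputs_by_orientation.items(),
        key=lambda item: abs(item[0] - head_orientation_deg),
    )
    return output

# Outputs keyed by designated head orientation, in degrees.
outputs = {-45: "left-turned mix", 0: "front-facing mix", 45: "right-turned mix"}
print(select_audio_output(30.0, outputs))  # -> "right-turned mix"
```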
  • In an embodiment, tailoring the audio output may be based at least in part on image data that is captured within the premises. For example, the image data may be captured by cameras 708 as described above with reference to FIG. 7 and/or by cameras 1002 as described above with reference to FIG. 10. For instance, the image data may be captured from a perspective of the viewer and/or from a perspective that is directed toward the viewer.
  • In another embodiment, tailoring the audio output may be based at least in part on captured audio data that is captured within the premises. For example, the audio data may be captured by microphones 710 as described above with reference to FIG. 7 and/or by microphones 1004 as described above with reference to FIG. 10. For instance, the audio data may be captured from a perspective of the viewer and/or from a perspective that is directed toward the viewer.
  • At step 1308, the audio output, which is tailored based at least in part on the at least one positional characteristic, is delivered to audibly supplement the video output for the viewer in the first orientation. In an exemplary implementation, audio system 128 delivers the audio output to audibly supplement the video output for viewer 106 in the first orientation.
• In some embodiments, one or more of steps 1302, 1304, 1306, and/or 1308 of flowchart 1300 may not be performed. Moreover, steps in addition to or in lieu of steps 1302, 1304, 1306, and/or 1308 may be performed.
  • FIG. 14 depicts a flowchart 1400 of a method for delivering an audio experience for ears of a listener via a plurality of speakers in accordance with an embodiment. The method of flowchart 1400 will be described herein with continued reference to media system 102 of FIG. 1. However, the method is not limited to that system.
  • As shown in FIG. 14, the method of flowchart 1400 begins at step 1402, in which ears of a listener are detected to be in a first orientation with respect to a plurality of speakers. In an exemplary implementation, reference information generation circuitry 110 a and/or reference information generation circuitry 110 b detects that the ears of viewer 106 are in a first orientation with respect to speakers 130 a-130 g.
  • At step 1404, first audio output that is based on audio content is delivered to attempt to establish, with a spatial orientation of the audio content, an audio experience for the ears of the listener in the first orientation. For instance, the spatial orientation of the audio content may be configured to accord with the ears of the listener being in the first orientation to provide the first audio output. In an exemplary implementation, audio system 128 delivers the first audio output that is based on the audio content to attempt to establish the audio experience for the ears of viewer 106 in the first orientation.
  • At step 1406, the ears of the listener are detected to be in a second orientation with respect to the plurality of speakers. In an exemplary implementation, reference information generation circuitry 110 a and/or reference information generation circuitry 110 b detects that the ears of viewer 106 are in a second orientation with respect to speakers 130 a-130 g.
  • At step 1408, second audio output is delivered to attempt to establish, with the spatial orientation of the audio content, the audio experience for the ears of the listener in the second orientation. For instance, the spatial orientation of the audio content may be modified to accord with the ears of the listener being in the second orientation to provide the second audio output. In an exemplary implementation, audio system 128 delivers the second audio output to attempt to establish the audio experience for the ears of viewer 106 in the second orientation.
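• Steps 1402 through 1408 amount to a tracking loop: whenever the detected ear orientation changes, the spatial orientation of the audio content is modified and new output is delivered. A minimal sketch follows, in which the list of orientation samples stands in for successive readings from the reference information generation circuitry, and the print statement stands in for the audio driver.

```python
def run_tracking_loop(orientation_samples_deg):
    """Re-tailor and redeliver audio output each time the detected ear
    orientation changes (stub action shown as a print)."""
    current = None
    for sample in orientation_samples_deg:
        if sample != current:
            print(f"modifying spatial orientation for ears at {sample} degrees")
            current = sample

# The listener turns from 0 to 30 degrees partway through the presentation.
run_tracking_loop([0, 0, 0, 30, 30])
```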
• At step 1410, a three-dimensional visual presentation that is tailored based on the listener being at a first location is delivered. For instance, element(s) of an adaptable screen assembly may be controlled to deliver the three-dimensional visual presentation. In an exemplary implementation, adaptable screen assembly 122 delivers the three-dimensional visual presentation that is tailored based on viewer 106 being at the first location.
  • At step 1412, a move by the listener to a second location is detected. In an exemplary implementation, reference information generation circuitry 110 a and/or reference information generation circuitry 110 b detects a move by viewer 106 to the second location.
• At step 1414, third audio output is delivered to establish, with the spatial orientation of the audio content, the audio experience for the ears of the listener at the second location. For instance, the spatial orientation of the audio content may be modified to accord with the ears of the listener being at the second location to provide the third audio output. In an exemplary implementation, audio system 128 delivers the third audio output to establish the audio experience for the ears of viewer 106 at the second location.
• At step 1416, a second three-dimensional visual presentation that is tailored based on the listener being at the second location is delivered. For instance, element(s) of an adaptable screen assembly may be controlled to deliver the second three-dimensional visual presentation. In an exemplary implementation, adaptable screen assembly 122 delivers the second three-dimensional visual presentation that is tailored based on viewer 106 being at the second location.
• In some embodiments, one or more of steps 1402, 1404, 1406, 1408, 1410, 1412, 1414, and/or 1416 of flowchart 1400 may not be performed. Moreover, steps in addition to or in lieu of steps 1402, 1404, 1406, 1408, 1410, 1412, 1414, and/or 1416 may be performed.
  • FIG. 15 is a block diagram of a media system 1502 that simultaneously presents first three-dimensional content to a first viewer 1506 having a first viewing reference 1508 and second three-dimensional content to a second viewer 1536 having a second viewing reference 1538 in accordance with an embodiment. As shown in FIG. 15, media system 1502 includes an adaptable screen assembly 1522, driver circuitry 1524, processing circuitry 1526, audio system 1528, reference information generation circuitry 1510 a, reference information generation circuitry 1510 b, and reference information generation circuitry 1510 c.
  • Generally speaking, media system 1502 operates to deliver light that includes one or more viewable images to a viewing area that includes first viewer 1506 and second viewer 1536. Media system 1502 also operates to deliver audio content that is associated with the one or more viewable images toward the viewing area. Media system 1502 may include, for example and without limitation, a television, a projection system, a home theater system, a monitor, a computing device (e.g., desktop computer, laptop computer, tablet computer) or a handheld device (e.g., a cellular phone, smart phone, personal media player, personal digital assistant), wherein the computing device or handheld device has at least one attached or integrated display.
  • Display characteristics of adaptable screen assembly 1522 may be modified to simultaneously present a first three-dimensional view of first visual content to first viewer 1506 and a second three-dimensional view of second visual content to second viewer 1536. Adaptable screen assemblies and manners of operating the same that can achieve this are described in the aforementioned, incorporated U.S. patent application Ser. No. 12/845,461, filed on Jul. 28, 2010 and entitled “Display Supporting Multiple Simultaneous 3D Views.” Such display characteristics may include, but are not limited to, the configuration of one or more adaptable light manipulators, the manner in which images are mapped to display pixels in a pixel array, the distance between the pixel array and the adaptable light manipulator(s), the angular orientation of the adaptable light manipulator(s), and the like.
• The adaptation of the display characteristics of adaptable screen assembly 1522 may be carried out, in part, by sending coordinated drive signals to various elements (e.g., a non-uniform backlight generator, a pixel array and an adaptable light manipulator) that comprise adaptable screen assembly 1522. This function is performed by driver circuitry 1524 responsive to the receipt of control signals from processing circuitry 1526. As noted above, the manner in which such coordinated drive signals may be generated is described in the aforementioned, incorporated U.S. patent application Ser. No. ______ (Attorney Docket No. A05.01240000), filed on even date herewith and entitled “Coordinated Driving of Adaptable Light Manipulator, Backlighting and Pixel Array in Support of Adaptable 2D and 3D Displays.”
  • Audio system 1528 is configured to deliver first audio content 1532 that is associated with the first three-dimensional view of the first visual content for at least first viewer 1506 and second audio content 1542 that is associated with the second three-dimensional view of the second visual content for at least second viewer 1536. Audio system 1528 delivers first audio content 1532 and second audio content 1542 in accordance with respective spatial orientations that are controlled by driver circuitry 1524.
  • For example, driver circuitry 1524 may modify a first spatial orientation of first audio content 1532 to take into consideration a change in orientation of first viewer 1506 with respect to sound sources that correspond to (e.g., are depicted in) the first three-dimensional view and/or changes in orientation of such sound sources with respect to first viewer 1506. Such sound sources may include a representation of second viewer 1536 in the context of the first three-dimensional view, for example. In another example, driver circuitry 1524 may modify a second spatial orientation of second audio content 1542 to take into consideration a change in orientation of second viewer 1536 with respect to sound sources that correspond to (e.g., are depicted in) the second three-dimensional view and/or changes in orientation of such sound sources with respect to second viewer 1536. Such sound sources may include a representation of first viewer 1506 in the context of the second three-dimensional view, for example. Driver circuitry 1524 modifies the first spatial orientation of first audio content 1532 and/or the second spatial orientation of second audio content 1542 responsive to the receipt of control signals from processing circuitry 1526.
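• One way to picture the per-viewer modifications just described is to recompute, for each viewer, the bearing of a depicted sound source relative to where that viewer is facing. The sketch below assumes a flat two-dimensional geometry and hypothetical coordinates; it is not drawn from the patent.

```python
import math

def source_azimuth_for_viewer(viewer_pos, viewer_heading_deg, source_pos):
    """Bearing of a depicted sound source relative to where the viewer is
    facing, so its spatial orientation can be re-rendered as either the
    viewer or the source moves."""
    dx = source_pos[0] - viewer_pos[0]
    dy = source_pos[1] - viewer_pos[1]
    bearing = math.degrees(math.atan2(dx, dy))  # 0 degrees = straight ahead
    # Wrap the difference into the range [-180, 180) degrees.
    relative = (bearing - viewer_heading_deg + 180.0) % 360.0 - 180.0
    return relative

# A source depicted one meter to the viewer's front-right, viewer facing forward.
print(source_azimuth_for_viewer((0.0, 0.0), 0.0, (1.0, 1.0)))  # -> 45.0
```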
  • In one embodiment, audio system 1528 includes first viewer-specific speakers 1544 (e.g., in a first headset or earbuds) worn or otherwise used by first viewer 1506 and second viewer-specific speakers 1546 (e.g., in a second headset or earbuds) worn or otherwise used by second viewer 1536. In accordance with this embodiment, audio system 1528 may provide first audio content 1532 to first viewer 1506 via first viewer-specific speakers 1544, and audio system 1528 may provide second audio content 1542 to second viewer 1536 via second viewer-specific speakers 1546. Accordingly, first viewer 1506 may hear first audio content 1532 without hearing second audio content 1542, and second viewer 1536 may hear second audio content 1542 without hearing first audio content 1532.
  • In another embodiment, audio system 1528 includes viewer-agnostic speakers (e.g., speakers mounted around a room in which first and second viewers 1506 and 1536 are located) and second viewer-specific speakers 1546. In accordance with this embodiment, audio system 1528 may provide first audio content 1532 to both first viewer 1506 and second viewer 1536 via the viewer-agnostic speakers, and audio system 1528 may provide second audio content 1542 to second viewer 1536 via second viewer-specific speakers 1546. Accordingly, first viewer 1506 may hear first audio content 1532 without hearing second audio content 1542, and second viewer 1536 may hear both first audio content 1532 and second audio content 1542. On the other hand, second viewer 1536 may hear second audio content 1542 but not first audio content 1532 if noise cancellation techniques are employed to hinder second viewer 1536 from perceiving first audio content 1532. Audio system 1528 may or may not include first viewer-specific speakers 1544 for providing the first audio content 1532 to first viewer 1506, in addition to the viewer-agnostic speakers providing the first audio content 1532 to both first viewer 1506 and second viewer 1536.
  • In yet another embodiment, audio system 1528 includes viewer-agnostic speakers but no viewer-specific speakers (e.g., viewer-specific speakers 1544 and 1546). In accordance with this embodiment, reference information generation circuitry 1510 a, 1510 b, and 1510 c may operate in conjunction to determine a reference orientation that is based on the orientation of first viewer 1506 and the orientation of second viewer 1536. For instance, the reference orientation may be an average of the orientation of first viewer 1506 and the orientation of second viewer 1536. In further accordance with this embodiment, processing circuitry 1526 may control driver circuitry 1524 to modify the first spatial orientation of the first audio content 1532 and the second spatial orientation of the second audio content 1542 in accordance with the reference orientation to provide resulting audio content that is based on the reference orientation. Accordingly, audio system 1528 may provide the resulting audio content to both first viewer 1506 and second viewer 1536. First viewer 1506 and second viewer 1536 both may hear the resulting audio content. It will be recognized that audio system 1528 may provide first audio content 1532 and second audio content 1542 via any combination of viewer-specific and/or viewer-agnostic speakers.
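• The averaging of viewer orientations mentioned above can be sketched as a circular mean, which avoids the wraparound problem of naively averaging angles. The use of degrees and the circular-mean formulation are choices made for this example.

```python
import math

def reference_orientation(orientations_deg):
    """Circular mean of the viewers' orientations, used as the shared
    reference orientation for viewer-agnostic speakers."""
    x = sum(math.cos(math.radians(a)) for a in orientations_deg)
    y = sum(math.sin(math.radians(a)) for a in orientations_deg)
    return math.degrees(math.atan2(y, x))

# First viewer faces 10 degrees, second faces 50; the reference is 30.
print(round(reference_orientation([10.0, 50.0]), 1))
```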
  • Reference information generation circuitry 1510 a and 1510 c comprise components of media system 1502 that operate in conjunction to produce first reference information concerning at least one positional characteristic of first viewing reference 1508 (i.e., orientation) of first viewer 1506 with respect to adaptable screen assembly 1522. Reference information generation circuitry 1510 b and 1510 c comprise components of media system 1502 that operate in conjunction to produce second reference information concerning at least one positional characteristic of second viewing reference 1538 (i.e., orientation) of second viewer 1536 with respect to adaptable screen assembly 1522. First viewing reference 1508 comprises one or more positional characteristics that affect how first three-dimensional visual content displayed via adaptable screen assembly 1522 and/or first audio content 1532 provided by audio system 1528 will be perceived by first viewer 1506. Second viewing reference 1538 comprises one or more positional characteristics that affect how second three-dimensional visual content simultaneously displayed via adaptable screen assembly 1522 and/or second audio content 1542 simultaneously provided by audio system 1528 will be perceived by second viewer 1536. Example positional characteristics of a viewing reference were described above.
  • The first reference information produced by reference information generation circuitry 1510 a and 1510 c is provided to processing circuitry 1526. Based on at least the first reference information, processing circuitry 1526 issues one or more first control signals to driver circuitry 1524 to modify at least one of the display characteristics of adaptable screen assembly 1522 and/or to modify the spatial orientation of first audio content 1532. Such modifications may be performed, for example, to deliver the first three-dimensional visual content and/or first audio content 1532 to first viewer 1506 in accordance with one or more positional characteristics of first viewing reference 1508. The second reference information produced by reference information generation circuitry 1510 b and 1510 c is also provided to processing circuitry 1526. Based on at least the second reference information, processing circuitry 1526 issues one or more second control signals to driver circuitry 1524 to modify at least one of the display characteristics of adaptable screen assembly 1522 and/or to modify the spatial orientation of second audio content 1542. Such modifications may be performed, for example, to deliver the second three-dimensional visual content and/or second audio content 1542 to second viewer 1536 in accordance with one or more positional characteristics of second viewing reference 1538.
  • Reference information generation circuitry 1510 a is intended to represent viewer-located circuitry that is situated on or near first viewer 1506 while reference information generation circuitry 1510 b is intended to represent viewer-located circuitry that is situated on or near second viewer 1536. Reference information generation circuitry 1510 a and 1510 b may include any of the components of reference information generation circuitry 110 a described above in reference to FIGS. 1 and 3-8.
• Reference information generation circuitry 1510 c is intended to represent circuitry that is not viewer-located. Reference information generation circuitry 1510 c is configured to interact with reference information generation circuitry 1510 a to determine one or more positional characteristics of first viewing reference 1508. Such interaction may involve, for example, implementing any of the techniques described above in reference to FIGS. 1 and 3-8 to estimate a location, head orientation and/or point of gaze of viewer 1506. Reference information generation circuitry 1510 c is further configured to interact with reference information generation circuitry 1510 b to determine one or more positional characteristics of second viewing reference 1538. Such interaction may involve, for example, implementing any of the techniques described above in reference to FIGS. 1 and 3-8 to estimate a location, head orientation and/or point of gaze of viewer 1536. By operating in this manner, reference information generation circuitry 1510 a, 1510 b and 1510 c can produce reference information about both viewing references 1508 and 1538. Such information can be used by control circuitry to optimize the delivery of the first three-dimensional content to first viewer 1506 and the simultaneous delivery of the second three-dimensional content to second viewer 1536.
  • FIG. 16 depicts a flowchart 1600 of a method for simultaneously presenting first three-dimensional content to a first viewer and second three-dimensional content to a second viewer in accordance with an embodiment. The method of flowchart 1600 will be described herein with continued reference to media system 1502 of FIG. 15. However, the method is not limited to that system.
  • As shown in FIG. 16, the method of flowchart 1600 begins at step 1602, in which first circuitry at least assists in producing first reference information corresponding to at least one positional characteristic of a first viewing reference of a first viewer. The first circuitry may comprise, for example, reference information generation circuitry 1510 a and/or reference information generation circuitry 1510 c as described above. The first viewing reference of the first viewer may comprise any of a number of positional characteristics that affect how first three-dimensional visual content displayed via an adaptable screen assembly and/or associated first audio that is provided by an audio system will be perceived by the first viewer. As noted above, such positional characteristics may include, for example and without limitation, a position or location of the first viewer relative to the adaptable screen assembly, a head orientation of the first viewer and/or a point of gaze of the first viewer. The first reference information may be produced using any of the approaches previously described herein as well as additional approaches not described herein.
  • At step 1604, the first reference information produced during step 1602 is provided to second circuitry.
  • At step 1606, the second circuitry issues one or more first control signals to cause modification of at least one of one or more adaptable display characteristics of an adaptable screen assembly based on at least the first reference information. The second circuitry may comprise, for example, processing circuitry 1526 as described above. The one or more adaptable display characteristics may include, but are not limited to, a configuration of one or more adaptable light manipulators that form part of the adaptable screen assembly, a manner in which images are mapped to display pixels in a pixel array that forms part of the adaptable screen assembly, a distance between such pixel array and such adaptable light manipulator(s), an angular orientation of such adaptable light manipulator(s), and the like.
  • At step 1608, the second circuitry issues one or more second control signals to cause modification of a spatial orientation of first audio content for at least the first viewer based on at least the first reference information.
  • At step 1610, third circuitry at least assists in producing second reference information corresponding to at least one positional characteristic of a second viewing reference of a second viewer. The third circuitry may comprise, for example, reference information generation circuitry 1510 b and/or reference information generation circuitry 1510 c as described above. The second viewing reference of the second viewer may comprise any of a number of positional characteristics that affect how second three-dimensional content that is simultaneously displayed with the first three-dimensional content by the adaptable screen assembly and/or associated second audio that is provided by the audio system will be perceived by the second viewer. As noted above, such positional characteristics may include, for example and without limitation, a position or location of the second viewer relative to the adaptable screen assembly, a head orientation of the second viewer and/or a point of gaze of the second viewer. The second reference information may be produced using any of the approaches previously described herein as well as additional approaches not described herein.
• At step 1612, the second reference information produced during step 1610 is provided to the second circuitry.
  • At step 1614, the second circuitry issues one or more third control signals to cause modification of at least one of the one or more adaptable display characteristics of the adaptable screen assembly based on at least the second reference information.
  • At step 1616, the second circuitry issues one or more fourth control signals to cause modification of a spatial orientation of second audio content for at least the second viewer based on at least the second reference information.
  • By way of further illustration, FIG. 17 depicts a flowchart 1700 of a method for delivering audio content to first and second viewers of a display capable of simultaneously presenting first video content to the first viewer and second video content to the second viewer. As shown in FIG. 17, the method of flowchart 1700 begins at step 1702 in which first audio content is delivered to first viewer-located circuitry (e.g., reference information generation circuitry 1510 a of FIG. 15) carried by the first viewer of the display, the first audio content being associated with the first video content. For instance, the first viewer-located circuitry may include first viewer-specific speakers of an audio system (e.g., audio system 1528).
  • At step 1704, second audio content is simultaneously delivered to second viewer-located circuitry (e.g., reference information generation circuitry 1510 b of FIG. 15) carried by the second viewer of the display, the second audio content being associated with the second video content. For instance, the second viewer-located circuitry may include second viewer-specific speakers of the audio system.
  • By providing separate channels for routing audio to first and second viewers 1506 and 1536, other benefits may be achieved. For example, first and second viewers 1506 and 1536 may each view the same video content but customize the corresponding audio content that is being delivered thereto in one or more ways including but not limited to content (e.g., choice of language) and audio settings (e.g., volume, mono vs. stereo sound, two-dimensional vs. three-dimensional audio, equalizer settings or the like). The customizations applied to the audio content delivered to first viewer 1506 need not be applied to the audio content delivered to second viewer 1536 and vice versa.
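• The per-viewer customizations listed above could be carried as a small settings record attached to each viewer's audio channel. The sketch below is illustrative; the field names mirror the examples in the text but are otherwise assumed.

```python
from dataclasses import dataclass

@dataclass
class ViewerAudioSettings:
    """Customizations applied independently on each viewer's audio channel."""
    language: str = "en"
    volume: float = 1.0
    stereo: bool = True
    three_dimensional: bool = False

first_viewer = ViewerAudioSettings(language="en", volume=0.8, three_dimensional=True)
second_viewer = ViewerAudioSettings(language="es", stereo=False)
print(first_viewer)
print(second_viewer)
```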
  • III. Exemplary Media System
  • FIG. 18 is a block diagram of an example implementation of a media system, such as media system 102 described above in reference to FIG. 1 and media system 1502 described above in reference to FIG. 15, in accordance with an embodiment. As shown in FIG. 18, media system 1800 generally comprises processing circuitry 1802, driver circuitry 1804, a screen assembly 1806, reference information generation circuitry 1808, and audio system 1858.
  • As shown in FIG. 18, processing circuitry 1802 includes a processing unit 1814, which may comprise one or more general-purpose or special-purpose processors or one or more processing cores. Processing unit 1814 is connected to a communication infrastructure 1812, such as a communication bus. Processing circuitry 1802 may also include a primary or main memory (not shown in FIG. 18), such as random access memory (RAM), that is connected to communication infrastructure 1812. The main memory may have control logic stored thereon for execution by processing unit 1814 as well as data stored thereon that may be input to or output by processing unit 1814 during execution of such control logic.
• Processing circuitry 1802 may also include one or more secondary storage devices (not shown in FIG. 18) that are connected to communication infrastructure 1812, including but not limited to a hard disk drive, a removable storage drive (such as an optical disk drive, a floppy disk drive, a magnetic tape drive, or the like), or an interface for communicating with a removable storage unit such as an interface for communicating with a memory card, memory stick or the like. Each of these secondary storage devices provides an additional means for storing control logic for execution by processing unit 1814 as well as data that may be input to or output by processing unit 1814 during execution of such control logic.
  • Processing circuitry 1802 further includes a user input interface 1818, a reference information generation circuitry interface (I/F) 1816, and a media interface 1820. User input interface 1818 is intended to generally represent any type of interface that may be used to receive user input, including but not limited to a remote control device, a traditional computer input device such as a keyboard or mouse, a touch screen, a gamepad or other type of gaming console input device, or one or more sensors including but not limited to video cameras, microphones and motion sensors.
  • Reference information generation circuitry interface 1816 is an interface that is suitable for connection to reference information generation circuitry 1808 and that allows processing circuitry 1802 to communicate therewith. As discussed extensively above, reference information generation circuitry 1808 comprises circuitry that is configured to generate information about one or more positional characteristics of one or more viewing references associated with one or more viewers of media system 1800.
  • Media interface 1820 is intended to represent any type of interface that is capable of receiving media content such as video content or image content. In certain implementations, media interface 1820 may comprise an interface for receiving media content from a remote source such as a broadcast media server, an on-demand media server, or the like. In such implementations, media interface 1820 may comprise, for example and without limitation, a wired or wireless internet or intranet connection, a satellite interface, a fiber interface, a coaxial cable interface, or a fiber-coaxial cable interface. Media interface 1820 may also comprise an interface for receiving media content from a local source such as a DVD or Blu-Ray® disc player, a personal computer, a personal media player, smart phone, or the like. Media interface 1820 may be capable of retrieving video content from multiple sources.
  • Processing circuitry 1802 further includes a communication interface 1822. Communication interface 1822 enables processing circuitry 1802 to send control signals via a communication medium 1852 to another communication interface 1830 within driver circuitry 1804, thereby enabling processing circuitry 1802 to control the operation of driver circuitry 1804. Communication medium 1852 may comprise any kind of wired or wireless communication medium suitable for transmitting such control signals.
  • As shown in FIG. 18, driver circuitry 1804 includes the aforementioned communication interface 1830 as well as pixel array driver circuitry 1832, adaptable light manipulator driver circuitry 1834, and speaker driver circuitry 1854. Driver circuitry 1804 also optionally includes light generator driver circuitry 1836. Each of pixel array driver circuitry 1832, adaptable light manipulator driver circuitry 1834, and light generator driver circuitry 1836 is configured to receive control signals from processing circuitry 1802 (via the link between communication interface 1822 and communication interface 1830) and, responsive thereto, to send selected drive signals to a corresponding hardware element within screen assembly 1806, the drive signals causing the corresponding hardware element to operate in a particular manner. In particular, pixel array driver circuitry 1832 is configured to send selected drive signals to a pixel array 1842 within screen assembly 1806, adaptable light manipulator driver circuitry 1834 is configured to send selected drive signals to an adaptable light manipulator 1844 within screen assembly 1806, and optional light generator driver circuitry 1836 is configured to send selected drive signals to an optional light generator 1846 within screen assembly 1806.
  • Driver circuitry 1804 also includes speaker driver circuitry 1854, which is configured to receive control signals from processing circuitry 1802 (via the link between communication interface 1822 and communication interface 1830) and, responsive thereto, to send selected drive signals to speakers 1860 within audio system 1858, the drive signals causing speakers 1860 to provide audio content having a specified spatial configuration.
  • In one example mode of operation, processing unit 1814 operates pursuant to control logic to receive visual and/or audio content via media interface 1820 and to generate control signals necessary to cause driver circuitry 1804 to render the visual content to screen assembly 1806 and/or the audio content to audio system 1858 in accordance with a selected viewing configuration. The viewing configuration may be selected based on, for example, reference information generated by and received from reference information generation circuitry 1808. The control logic that is executed by processing unit 1814 may be retrieved, for example, from a primary memory or a secondary storage device connected to processing unit 1814 via communication infrastructure 1812 as discussed above. The control logic may also be retrieved from some other local or remote source. Where the control logic is stored on a computer readable medium, that computer readable medium may be referred to herein as a computer program product.
• Among other features, driver circuitry 1804 may be controlled in a manner described in the aforementioned, incorporated U.S. patent application Ser. No. ______ (Attorney Docket No. A05.01240000), filed on even date herewith and entitled “Coordinated Driving of Adaptable Light Manipulator, Backlighting and Pixel Array in Support of Adaptable 2D and 3D Displays” (the entirety of which is incorporated by reference herein) to send coordinated drive signals necessary for displaying two-dimensional content and three-dimensional content via screen assembly 1806. In certain operating modes, such content may be simultaneously displayed via different display regions of screen assembly 1806. The manner in which pixel array 1842, adaptable light manipulator 1844 (e.g., an adaptable parallax barrier), and light generator 1846 may be manipulated in a coordinated fashion to perform this function was described in the patent application referenced immediately above. It will be recognized that speakers 1860 may be controlled in like manner to provide a coordinated three-dimensional audio and visual experience. Note that in accordance with certain implementations (e.g., implementations in which pixel array 1842 comprises an OLED/PLED pixel array), screen assembly 1806 need not include light generator 1846.
  • In one embodiment, at least part of the function of generating control signals necessary to cause pixel array 1842, adaptable light manipulator 1844 and light generator 1846 to render visual content and/or to cause speakers 1860 to render audio content in accordance with a selected viewing configuration is performed by drive signal processing circuitry 1838 which is integrated within driver circuitry 1804. Such circuitry may operate, for example, in conjunction with and/or under the control of processing unit 1814 to generate the necessary control signals.
• In certain implementations, processing circuitry 1802, driver circuitry 1804, screen assembly 1806, and potentially at least some portion of audio system 1858 are all included within a single housing. For example and without limitation, all these elements may exist within a television, a laptop computer, a tablet computer, or a telephone. In accordance with such an implementation, link 1850 formed between communication interfaces 1822 and 1830 may be replaced by a direct connection between driver circuitry 1804 and communication infrastructure 1812. In an alternate implementation, processing circuitry 1802 and potentially at least some portion of audio system 1858 are disposed within a first housing, such as a set top box or personal computer, and driver circuitry 1804, screen assembly 1806, and potentially at least some portion of audio system 1858 are disposed within a second housing, such as a television or computer monitor. In yet another alternate implementation, audio system 1858 may be disposed within at least one third housing, such as a plurality of speaker assemblies located around a viewing area. The set top box may be any type of set top box including but not limited to fiber, Internet, cable, satellite, or terrestrial digital.
• The various processing circuitry elements 126, 1526, and 1802 shown in underlying figures may exist, in whole or in part, in any one or more media environment devices. Examples of a media environment device include but are not limited to a home set top box, a location support unit, a gateway, an access point, a media player (e.g., a DVD, CD, or Blu-Ray player), a projection system, a display device (e.g., a television, a monitor, a personal computer, a phone, etc.), etc.
• Such processing circuitry attempts to identify a synchronized 3D viewing and listening experience based on reference information concerning at least one positional characteristic of a viewing and listening reference (i.e., orientation) of a viewer, as described above. Identification of a synchronized 3D viewing and listening experience may be performed in a variety of ways. For example, appropriate camera views may be selected to be displayed. For instance, a 3D8 data set might drive a 3D2 display wherein particular pairs of camera perspectives/views are selected based on the viewer's reference information. In another example, interpolation might be applied to generate interpolated views. For instance, interpolation may be applied to generate interpolated views in 3D2 data sets to produce perhaps four distinct camera views (wherein two of the camera views are interpolations). Lastly, for the video, the light manipulator itself in a 3D8 display might provide the 3D visual experience as the viewer's eye reference point changes.
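• As an illustration of the first approach, driving a 3D2 display from a 3D8 data set, the following sketch maps a viewer's horizontal angle onto an adjacent pair of the eight camera views. The 90-degree viewing field and the linear mapping are assumptions made for the example.

```python
def select_view_pair(viewer_angle_deg, num_views=8, field_deg=90.0):
    """Map a viewer's horizontal angle across an assumed viewing field onto
    an adjacent pair of camera views, e.g. driving a 3D2 display from a
    3D8 data set."""
    # Normalize the angle to [0, 1) across the viewing field, clamping the edges.
    t = max(0.0, min(viewer_angle_deg / field_deg, 0.999))
    left = int(t * (num_views - 1))
    return left, left + 1

print(select_view_pair(30.0))  # -> views (2, 3) of the eight, for this sketch
```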
  • A 3D audio experience may change in synch with a changing 3D visual experience in a variety of ways. For example, the processing circuitry might smoothly migrate between a plurality of audio channel sets that each correspond to one of the plurality of visual reference points. An audio channel set may be, for example, a Dolby 5.1 set, wherein each set is captured or produced for a particular reference point that may correspond to that of the various camera views.
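• Migrating smoothly between such audio channel sets can be sketched as a per-channel crossfade between the sets captured at neighboring reference points. The six-channel gain vectors below merely stand in for 5.1 channel sets; they are not drawn from the patent.

```python
def migrate_channel_sets(set_a, set_b, blend):
    """Smoothly migrate between two audio channel sets (e.g. two 5.1 sets
    captured from neighboring reference points) by per-channel crossfade;
    blend runs from 0.0 (all set_a) to 1.0 (all set_b)."""
    return [(1.0 - blend) * a + blend * b for a, b in zip(set_a, set_b)]

# Per-channel gains for two reference points, sampled halfway between them.
print(migrate_channel_sets([1.0, 0.5, 0.2, 0.2, 0.1, 0.3],
                           [0.4, 0.9, 0.2, 0.6, 0.3, 0.3], 0.5))
```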
• Alternatively, or in addition, audio sets for each specified noise source having a 3D origin can be captured or produced independently. For example, a general background set (e.g., music, etc.) may be captured/produced from a single reference point, a distant explosion may be captured/produced from 4 reference points, and local speech from a relatively close on-screen actor might be captured/produced with 8 reference points, all in a 3D2 or a 3D4 environment. Thereafter, as a viewer's reference changes, both the 3D visual experience and the 3D listening experience (i.e., the 3D sensory environment) change correspondingly.
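• Under the per-source scheme just described, each sound source simply offers a different number of capture reference points to choose from. A sketch of picking the nearest one per source follows, with all angles and counts assumed for illustration.

```python
def nearest_reference_point(viewer_angle_deg, capture_angles_deg):
    """Pick, for one sound source, the captured reference point closest to
    the viewer's current reference; sources captured with more reference
    points (8 vs. 4 vs. 1) simply offer a finer-grained choice."""
    return min(capture_angles_deg, key=lambda a: abs(a - viewer_angle_deg))

background = [0.0]                        # general background: one reference point
explosion = [0.0, 90.0, 180.0, 270.0]     # distant explosion: four reference points
speech = [i * 45.0 for i in range(8)]     # close actor's speech: eight reference points
print(nearest_reference_point(60.0, background),
      nearest_reference_point(60.0, explosion),
      nearest_reference_point(60.0, speech))
```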
  • IV. Conclusion
  • While various embodiments of the present invention have been described above, it should be understood that they have been presented by way of example only, and not limitation. It will be apparent to persons skilled in the relevant art that various changes in form and detail can be made therein without departing from the spirit and scope of the invention. Thus, the breadth and scope of the present invention should not be limited by any of the above-described exemplary embodiments, but should be defined only in accordance with the following claims and their equivalents.

Claims (21)

1. A method supporting a viewer of a three-dimensional visual presentation within a premises, the viewer having a first orientation within the premises, the method comprising:
identifying at least one positional characteristic relating to the first orientation of the viewer within the premises;
delivering a video output, tailored based at least in part on the at least one positional characteristic, for the three-dimensional visual presentation to the viewer in the first orientation; and
delivering an audio output, tailored based at least in part on the at least one positional characteristic, to audibly supplement the video output for the viewer in the first orientation.
2. The method of claim 1, wherein the at least one positional characteristic comprises a relative location of the viewer.
3. The method of claim 1, wherein the at least one positional characteristic comprises an orientation of a head of the viewer.
4. The method of claim 1, further comprising tailoring the audio output based at least in part on the at least one positional characteristic.
5. The method of claim 4, wherein the tailoring of the audio output comprises selecting the audio output from a plurality of audio outputs.
6. The method of claim 4, wherein the tailoring of the audio output comprises generating the audio output.
7. The method of claim 1, wherein the audio output is tailored by changing at least an output amplitude.
8. The method of claim 1, wherein the audio output is tailored by adding a delay.
9. Media circuitry supporting a viewer of a three-dimensional visual presentation within a premises, the viewer having a first orientation within the premises, the media circuitry comprising:
first circuitry that identifies at least one positional characteristic relating to the first orientation of the viewer within the premises;
second circuitry that delivers a video output for the three-dimensional visual presentation to the viewer in the first orientation; and
third circuitry that delivers an audio output that is tailored based at least in part on the at least one positional characteristic, the audio output supplementing the video output for the viewer in the first orientation.
10. The media circuitry of claim 9, wherein the at least one positional characteristic comprises a relative location of the viewer.
11. The media circuitry of claim 9, wherein the at least one positional characteristic comprises an orientation of a head of the viewer.
12. The media circuitry of claim 9, further comprising fourth circuitry that performs the tailoring of the audio output.
13. The media circuitry of claim 12, wherein the fourth circuitry performs the tailoring by selecting the audio output from a plurality of audio outputs.
14. The media circuitry of claim 12, wherein the fourth circuitry performs the tailoring by changing at least an output amplitude.
15. The media circuitry of claim 12, wherein the fourth circuitry performs the tailoring by adding a delay.
16. The media circuitry of claim 12, wherein the fourth circuitry performs the tailoring based at least in part on image data captured within the premises.
17. The media circuitry of claim 12, wherein the fourth circuitry performs the tailoring based at least in part on captured audio data captured within the premises.
18. The media circuitry of claim 12, further comprising fifth circuitry that is carried by the viewer, the fifth circuitry assisting the first circuitry in the identification of the at least one positional characteristic.
19. A method relating to delivery of an audio experience for ears of a listener via a plurality of speakers, the audio experience to be established based on audio output that is based on audio content, the audio content having a spatial orientation, the listener being at a first location, the ears of the listener being in either a first orientation or a second orientation with respect to the plurality of speakers, the method comprising:
detecting the ears of the listener being in the first orientation with respect to the plurality of speakers;
delivering first audio output to attempt to establish, with the spatial orientation of the audio content, the audio experience for the ears of the listener in the first orientation;
detecting the ears of the listener being in the second orientation with respect to the plurality of speakers; and
delivering second audio output to attempt to establish, with the spatial orientation of the audio content, the audio experience for the ears of the listener in the second orientation.
20. The method of claim 19, further comprising:
detecting a move by the listener to a second location; and
delivering third audio output to establish, with the spatial orientation of the audio content, the audio experience for the ears of the listener at the second location.
21. The method of claim 19, further comprising:
delivering a three-dimensional visual presentation that is tailored based on the listener being at the first location.
Cited By (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110102558A1 (en) * 2006-10-05 2011-05-05 Renaud Moliton Display device for stereoscopic display
US20110157326A1 (en) * 2009-12-31 2011-06-30 Broadcom Corporation Multi-path and multi-source 3d content storage, retrieval, and delivery
US20110164188A1 (en) * 2009-12-31 2011-07-07 Broadcom Corporation Remote control with integrated position, viewer identification and optical and audio test
US20110254934A1 (en) * 2010-04-16 2011-10-20 Samsung Electronics Co., Ltd. Display apparatus, 3d glasses, and display system including the same
US20120105610A1 (en) * 2010-11-03 2012-05-03 Samsung Electronics Co., Ltd. Method and apparatus for providing 3d effect in video device
US20120148055A1 (en) * 2010-12-13 2012-06-14 Samsung Electronics Co., Ltd. Audio processing apparatus, audio receiver and method for providing audio thereof
US20120300026A1 (en) * 2011-05-24 2012-11-29 William Allen Audio-Video Signal Processing
US20120307048A1 (en) * 2011-05-30 2012-12-06 Sony Ericsson Mobile Communications Ab Sensor-based placement of sound in video recording
EP2637416A1 (en) * 2012-03-06 2013-09-11 Alcatel Lucent A system and method for optimized streaming of variable multi-viewpoint media
US20130278732A1 (en) * 2012-04-24 2013-10-24 Mobitv, Inc. Control of perspective in multi-dimensional media
US20130328777A1 (en) * 2012-06-12 2013-12-12 Andrew Johnson System and methods for visualizing information
US20140036044A1 (en) * 2012-08-03 2014-02-06 Samsung Electronics Co., Ltd. Display apparatus which displays a plurality of content views, glasses apparatus which synchronizes with one of the content views, and methods thereof
CN103703772A (en) * 2011-07-18 2014-04-02 三星电子株式会社 Content playing method and apparatus
US8854531B2 (en) 2009-12-31 2014-10-07 Broadcom Corporation Multiple remote controllers that each simultaneously controls a different visual presentation of a 2D/3D display
US20140328505A1 (en) * 2013-05-02 2014-11-06 Microsoft Corporation Sound field adaptation based upon user tracking
US20150085076A1 (en) * 2013-09-24 2015-03-26 Amazon Techologies, Inc. Approaches for simulating three-dimensional views
US9047054B1 (en) * 2012-12-20 2015-06-02 Audible, Inc. User location-based management of content presentation
US20150316640A1 (en) * 2012-09-17 2015-11-05 Nokia Technologies Oy Method and apparatus for associating audio objects with content and geo-location
EP2963951A1 (en) * 2014-07-02 2016-01-06 Samsung Electronics Co., Ltd Method, user terminal, and audio system, for speaker location detection and level control using magnetic field
US9247286B2 (en) 2009-12-31 2016-01-26 Broadcom Corporation Frame formatting supporting mixed two and three dimensional video data communication
US9265458B2 (en) 2012-12-04 2016-02-23 Sync-Think, Inc. Application of smooth pursuit cognitive testing paradigms to clinical drug development
US9380976B2 (en) 2013-03-11 2016-07-05 Sync-Think, Inc. Optical neuroinformatics
WO2017156622A1 (en) * 2016-03-13 2017-09-21 Rising Sun Productions Limited Head-mounted audiovisual capture device
US20170366914A1 (en) * 2016-06-17 2017-12-21 Edward Stein Audio rendering using 6-dof tracking
EP3276982A1 (en) * 2016-07-28 2018-01-31 Canon Kabushiki Kaisha Information processing apparatus, information processing method, and program
US20180046431A1 (en) * 2016-08-10 2018-02-15 Qualcomm Incorporated Multimedia device for processing spatialized audio based on movement
US20180176545A1 (en) * 2016-11-25 2018-06-21 Nokia Technologies Oy Virtual reality display
EP3503579A1 (en) * 2017-12-20 2019-06-26 Nokia Technologies Oy Multi-camera device
US10609503B2 (en) 2018-04-08 2020-03-31 Dts, Inc. Ambisonic depth extraction
US10777057B1 (en) * 2017-11-30 2020-09-15 Amazon Technologies, Inc. Premises security system with audio simulating occupancy
US10802324B2 (en) 2017-03-14 2020-10-13 Boe Technology Group Co., Ltd. Double vision display method and device
US10932080B2 (en) 2019-02-14 2021-02-23 Microsoft Technology Licensing, Llc Multi-sensor object tracking for modifying audio
US20220068185A1 (en) * 2019-04-29 2022-03-03 Hewlett-Packard Development Company, L.P. Wireless configuration of display attribute
US11337020B2 (en) 2018-06-07 2022-05-17 Nokia Technologies Oy Controlling rendering of a spatial audio scene
US20220248088A1 (en) * 2018-04-11 2022-08-04 Alcacruz Inc. Digital media system
US11443487B2 (en) * 2017-06-30 2022-09-13 Nokia Technologies Oy Methods, apparatus, systems, computer programs for enabling consumption of virtual content for mediated reality

Families Citing this family (478)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8416217B1 (en) 2002-11-04 2013-04-09 Neonode Inc. Light-based finger gesture user interface
US9015736B2 (en) * 2005-12-29 2015-04-21 Rovi Guides, Inc. Systems and methods for episode tracking in an interactive media environment
PL2023812T3 (en) 2006-05-19 2017-07-31 The Queen's Medical Center Motion tracking system for real time adaptive imaging and spectroscopy
JP2008106185A (en) * 2006-10-27 2008-05-08 Shin Etsu Chem Co Ltd Method for adhering thermally conductive silicone composition, primer for adhesion of thermally conductive silicone composition and method for production of adhesion composite of thermally conductive silicone composition
US8570423B2 (en) * 2009-01-28 2013-10-29 Hewlett-Packard Development Company, L.P. Systems for performing visual collaboration between remotely situated participants
US8775023B2 (en) 2009-02-15 2014-07-08 Neonode Inc. Light-based touch controls on a steering wheel and dashboard
EP2256620A1 (en) * 2009-05-29 2010-12-01 Koninklijke Philips Electronics N.V. Picture selection method for modular lighting system
US8125418B2 (en) * 2009-06-26 2012-02-28 Global Oled Technology Llc Passive-matrix chiplet drivers for displays
WO2011021894A2 (en) * 2009-08-20 2011-02-24 Lg Electronics Inc. Image display apparatus and method for operating the same
JP5187639B2 (en) * 2009-08-28 2013-04-24 独立行政法人情報通信研究機構 3D display
US20110080472A1 (en) * 2009-10-02 2011-04-07 Eric Gagneraud Autostereoscopic status display
CA2776909A1 (en) * 2009-10-07 2011-04-14 Telewatch Inc. Video analytics method and system
WO2011072016A1 (en) * 2009-12-08 2011-06-16 Broadcom Corporation Method and system for handling multiple 3-d video formats
US20110143769A1 (en) * 2009-12-16 2011-06-16 Microsoft Corporation Dual display mobile communication device
EP2517399B1 (en) 2009-12-21 2017-01-25 Kik Interactive Inc. Systems and methods for accessing and controlling media stored remotely
US8684531B2 (en) * 2009-12-28 2014-04-01 Vision3D Technologies, Llc Stereoscopic display device projecting parallax image and adjusting amount of parallax
US20110187839A1 (en) * 2010-02-01 2011-08-04 VIZIO Inc. Frame based three-dimensional encoding method
US20110191328A1 (en) * 2010-02-03 2011-08-04 Vernon Todd H System and method for extracting representative media content from an online document
US20110202845A1 (en) * 2010-02-17 2011-08-18 Anthony Jon Mountjoy System and method for generating and distributing three dimensional interactive content
JP2011199853A (en) * 2010-02-23 2011-10-06 Panasonic Corp Three-dimensional image reproducing apparatus
DE102010009737A1 (en) * 2010-03-01 2011-09-01 Institut für Rundfunktechnik GmbH Method and arrangement for reproducing 3D image content
JP5462672B2 (en) * 2010-03-16 2014-04-02 株式会社ジャパンディスプレイ Display device and electronic device
US8634873B2 (en) * 2010-03-17 2014-01-21 Microsoft Corporation Mobile communication device having multiple, interchangeable second devices
KR101289269B1 (en) * 2010-03-23 2013-07-24 한국전자통신연구원 An apparatus and method for displaying image data in image system
KR20110109565A (en) * 2010-03-31 2011-10-06 삼성전자주식회사 Backlight unit, 3d display having the same and method of making 3d image
US10448083B2 (en) * 2010-04-06 2019-10-15 Comcast Cable Communications, Llc Streaming and rendering of 3-dimensional video
CN102449534B (en) * 2010-04-21 2014-07-02 松下电器产业株式会社 Three-dimensional video display device and three-dimensional video display method
US8667533B2 (en) * 2010-04-22 2014-03-04 Microsoft Corporation Customizing streaming content presentation
US9271052B2 (en) 2010-05-10 2016-02-23 Comcast Cable Communications, Llc Grid encoded media asset data
US9030536B2 (en) 2010-06-04 2015-05-12 At&T Intellectual Property I, Lp Apparatus and method for presenting media content
JP5510097B2 (en) * 2010-06-16 2014-06-04 ソニー株式会社 Signal transmission method, signal transmission device, and signal reception device
US9225975B2 (en) 2010-06-21 2015-12-29 Microsoft Technology Licensing, Llc Optimization of a multi-view display
US10089937B2 (en) * 2010-06-21 2018-10-02 Microsoft Technology Licensing, Llc Spatial and temporal multiplexing display
KR20110139497A (en) * 2010-06-23 2011-12-29 삼성전자주식회사 Display apparatus and method for displaying thereof
JP2012013980A (en) * 2010-07-01 2012-01-19 Sony Corp Stereoscopic display device and display drive circuit
US9049426B2 (en) * 2010-07-07 2015-06-02 At&T Intellectual Property I, Lp Apparatus and method for distributing three dimensional media content
US8670070B2 (en) * 2010-07-15 2014-03-11 Broadcom Corporation Method and system for achieving better picture quality in various zoom modes
US9032470B2 (en) 2010-07-20 2015-05-12 At&T Intellectual Property I, Lp Apparatus for adapting a presentation of media content according to a position of a viewing apparatus
US9232274B2 (en) 2010-07-20 2016-01-05 At&T Intellectual Property I, L.P. Apparatus for adapting a presentation of media content to a requesting device
JP2012034138A (en) * 2010-07-29 2012-02-16 Toshiba Corp Signal processing apparatus and signal processing method
KR20120020627A (en) * 2010-08-30 2012-03-08 삼성전자주식회사 Apparatus and method for image processing using 3d image format
WO2012028678A2 (en) * 2010-09-01 2012-03-08 Seereal Technologies S.A. Backplane device
JP5058316B2 (en) * 2010-09-03 2012-10-24 株式会社東芝 Electronic device, image processing method, and image processing program
US20120057007A1 (en) * 2010-09-03 2012-03-08 Satoshi Ishiguro Simplified Visual Screening Check on Television
JP5364666B2 (en) * 2010-09-13 2013-12-11 株式会社東芝 Stereoscopic image display apparatus, method and program
JP5368399B2 (en) * 2010-09-17 2013-12-18 富士フイルム株式会社 Electronic album generating apparatus, stereoscopic image pasting apparatus, operation control method thereof, and program thereof
EP2432218B1 (en) * 2010-09-20 2016-04-20 EchoStar Technologies L.L.C. Methods of displaying an electronic program guide
AU2011305445B2 (en) 2010-09-24 2017-03-16 The Board Of Trustees Of The Leland Stanford Junior University Direct capture, amplification and sequencing of target DNA using immobilized primers
CN103154801B (en) * 2010-10-13 2015-02-11 夏普株式会社 Display device
US10157526B2 (en) 2010-11-05 2018-12-18 Razberi Technologies, Inc. System and method for a security system
US8922658B2 (en) * 2010-11-05 2014-12-30 Tom Galvin Network video recorder system
KR101670927B1 (en) * 2010-11-05 2016-11-01 삼성전자주식회사 Display apparatus and method
US9860490B2 (en) 2010-11-05 2018-01-02 Tom Galvin Network video recorder system
US10477158B2 (en) 2010-11-05 2019-11-12 Razberi Technologies, Inc. System and method for a security system
US11082665B2 (en) 2010-11-05 2021-08-03 Razberi Secure Technologies, Llc System and method for a security system
EP2461238B1 (en) 2010-12-02 2017-06-28 LG Electronics Inc. Image display apparatus including an input device
US9172943B2 (en) * 2010-12-07 2015-10-27 At&T Intellectual Property I, L.P. Dynamic modification of video content at a set-top box device
KR101734285B1 (en) * 2010-12-14 2017-05-11 엘지전자 주식회사 Video processing apparatus of mobile terminal and method thereof
US8963694B2 (en) * 2010-12-17 2015-02-24 Sony Corporation System and method for remote controlled device selection based on device position data and orientation data of a user
US20120154559A1 (en) * 2010-12-21 2012-06-21 Voss Shane D Generate Media
US9386294B2 (en) * 2011-01-05 2016-07-05 Google Technology Holdings LLC Method and apparatus for 3DTV image adjustment
US20120178380A1 (en) * 2011-01-07 2012-07-12 Microsoft Corporation Wireless Communication Techniques
US8643684B2 (en) * 2011-01-18 2014-02-04 Disney Enterprises, Inc. Multi-layer plenoptic displays that combine multiple emissive and light modulating planes
TW201232280A (en) * 2011-01-20 2012-08-01 Hon Hai Prec Ind Co Ltd System and method for sharing desktop information
KR20120088467A (en) * 2011-01-31 2012-08-08 삼성전자주식회사 Method and apparatus for displaying partial 3d image in 2d image display area
JP5632764B2 (en) * 2011-02-02 2014-11-26 セイコーインスツル株式会社 Stereoscopic image display device
US20120202187A1 (en) * 2011-02-03 2012-08-09 Shadowbox Comics, Llc Method for distribution and display of sequential graphic art
US10083639B2 (en) * 2011-02-04 2018-09-25 Seiko Epson Corporation Control device for controlling image display device, head-mounted display device, image display system, control method for the image display device, and control method for the head-mounted display device
US8724467B2 (en) 2011-02-04 2014-05-13 Cisco Technology, Inc. System and method for managing congestion in a network environment
TWI569041B (en) 2011-02-14 2017-02-01 半導體能源研究所股份有限公司 Display device
US8630247B2 (en) * 2011-02-15 2014-01-14 Cisco Technology, Inc. System and method for managing tracking area identity lists in a mobile network environment
WO2012111427A1 (en) 2011-02-16 2012-08-23 Semiconductor Energy Laboratory Co., Ltd. Display device
US9035860B2 (en) 2011-02-16 2015-05-19 Semiconductor Energy Laboratory Co., Ltd. Display device
US9443455B2 (en) 2011-02-25 2016-09-13 Semiconductor Energy Laboratory Co., Ltd. Display device having a plurality of pixels
KR101852428B1 (en) * 2011-03-09 2018-04-26 엘지전자 주식회사 Mobile terminal and 3d object control method thereof
US9558687B2 (en) 2011-03-11 2017-01-31 Semiconductor Energy Laboratory Co., Ltd. Display device and method for driving the same
US9578299B2 (en) 2011-03-14 2017-02-21 Qualcomm Incorporated Stereoscopic conversion for shader based graphics content
JP5766479B2 (en) * 2011-03-25 2015-08-19 京セラ株式会社 Electronic device, control method, and control program
JP5730091B2 (en) * 2011-03-25 2015-06-03 株式会社ジャパンディスプレイ Display panel, display device and electronic device
JP5092033B2 (en) * 2011-03-28 2012-12-05 株式会社東芝 Electronic device, display control method, and display control program
JP2012205285A (en) * 2011-03-28 2012-10-22 Sony Corp Video signal processing apparatus and video signal processing method
WO2012138539A2 (en) * 2011-04-08 2012-10-11 The Regents Of The University Of California Interactive system for collecting, displaying, and ranking items based on quantitative and textual input from multiple participants
US8988512B2 (en) * 2011-04-14 2015-03-24 Mediatek Inc. Method for adjusting playback of multimedia content according to detection result of user status and related apparatus thereof
JP5161998B2 (en) * 2011-04-19 2013-03-13 株式会社東芝 Information processing apparatus, information processing method, and program
JP5162000B2 (en) * 2011-04-19 2013-03-13 株式会社東芝 Information processing apparatus, information processing method, and program
JP5161999B2 (en) * 2011-04-19 2013-03-13 株式会社東芝 Electronic device, display control method, and display control program
WO2012150940A1 (en) * 2011-05-05 2012-11-08 Empire Technology Development Llc Lenticular directional display
US20120287115A1 (en) * 2011-05-10 2012-11-15 Ding Junjie Method for generating image frames
KR20120126458A (en) * 2011-05-11 2012-11-21 엘지전자 주식회사 Method for processing broadcasting signal and display device thereof
WO2012156778A1 (en) * 2011-05-13 2012-11-22 Sony Ericsson Mobile Communications Ab Adjusting parallax barriers
US9420259B2 (en) * 2011-05-24 2016-08-16 Comcast Cable Communications, Llc Dynamic distribution of three-dimensional content
JP6050941B2 (en) * 2011-05-26 2016-12-21 Saturn Licensing LLC Display device and method, and program
US9442562B2 (en) * 2011-05-27 2016-09-13 Dolby Laboratories Licensing Corporation Systems and methods of image processing that adjust for viewer position, screen size and viewing distance
CN103262551B (en) * 2011-06-01 2015-12-09 松下电器产业株式会社 Image processing device, transmission device, image processing system, image processing method, transmission method and integrated circuit
JP2012253543A (en) * 2011-06-02 2012-12-20 Seiko Epson Corp Display device, control method of display device, and program
JP5770018B2 (en) * 2011-06-03 2015-08-26 任天堂株式会社 Display control program, display control apparatus, display control method, and display control system
US9420268B2 (en) 2011-06-23 2016-08-16 Lg Electronics Inc. Apparatus and method for displaying 3-dimensional image
US9445046B2 (en) 2011-06-24 2016-09-13 At&T Intellectual Property I, L.P. Apparatus and method for presenting media content with telepresence
US9602766B2 (en) 2011-06-24 2017-03-21 At&T Intellectual Property I, L.P. Apparatus and method for presenting three dimensional objects with telepresence
JP5890517B2 (en) * 2011-06-24 2016-03-22 Thomson Licensing Method and device for delivering 3D content
US9030522B2 (en) 2011-06-24 2015-05-12 At&T Intellectual Property I, Lp Apparatus and method for providing media content
KR101772458B1 (en) * 2011-06-28 2017-08-30 엘지전자 주식회사 Display device and method for controlling thereof
US20130265300A1 (en) * 2011-07-03 2013-10-10 Neorai Vardi Computer device in form of wearable glasses and user interface thereof
JP2013015779A (en) * 2011-07-06 2013-01-24 Sony Corp Display control device, display control method, and computer program
US8988411B2 (en) 2011-07-08 2015-03-24 Semiconductor Energy Laboratory Co., Ltd. Display device
US9137522B2 (en) * 2011-07-11 2015-09-15 Realtek Semiconductor Corp. Device and method for 3-D display control
US9294752B2 (en) * 2011-07-13 2016-03-22 Google Technology Holdings LLC Dual mode user interface system and method for 3D video
KR101925495B1 (en) 2011-07-15 2018-12-05 가부시키가이샤 한도오따이 에네루기 켄큐쇼 Display device and method for driving the display device
US8587635B2 (en) 2011-07-15 2013-11-19 At&T Intellectual Property I, L.P. Apparatus and method for providing media services with telepresence
KR20130010834A (en) * 2011-07-19 2013-01-29 가부시키가이샤 한도오따이 에네루기 켄큐쇼 Display device
JP2013038454A (en) * 2011-08-03 2013-02-21 Sony Corp Image processor, method, and program
JP2013038504A (en) 2011-08-04 2013-02-21 Sony Corp Imaging device, image processing method and program
JP5815326B2 (en) * 2011-08-12 2015-11-17 ルネサスエレクトロニクス株式会社 Video decoding device and image display device
JP6101267B2 (en) * 2011-08-18 2017-03-22 Utherverse Digital, Inc. Virtual world interaction system and method
US10659724B2 (en) * 2011-08-24 2020-05-19 Ati Technologies Ulc Method and apparatus for providing dropped picture image processing
US9606209B2 (en) 2011-08-26 2017-03-28 Kineticor, Inc. Methods, systems, and devices for intra-scan motion correction
US20130050596A1 (en) * 2011-08-30 2013-02-28 Industrial Technology Research Institute Auto-stereoscopic display and method for fabricating the same
JP2013050537A (en) * 2011-08-30 2013-03-14 Sony Corp Display device and electronic apparatus
JP2013050539A (en) * 2011-08-30 2013-03-14 Sony Corp Display device and electronic apparatus
JP2013050538A (en) 2011-08-30 2013-03-14 Sony Corp Display device and electronic apparatus
WO2013032221A1 (en) * 2011-08-31 2013-03-07 엘지전자 주식회사 Digital broadcast signal processing method and device
US8872813B2 (en) 2011-09-02 2014-10-28 Adobe Systems Incorporated Parallax image authoring and viewing in digital media
CN102368244B (en) * 2011-09-08 2013-05-15 广州市动景计算机科技有限公司 Page content alignment method, device and mobile terminal browser
DE112012003931T5 (en) 2011-09-21 2014-07-10 Magna Electronics, Inc. Image processing system for a motor vehicle with image data transmission and power supply via a coaxial cable
CN102510503B (en) * 2011-09-30 2015-06-03 深圳超多维光电子有限公司 Stereoscopic display method and stereoscopic display equipment
JP5715539B2 (en) * 2011-10-06 2015-05-07 株式会社ジャパンディスプレイ Display device and electronic device
KR20130037861A (en) * 2011-10-07 2013-04-17 삼성디스플레이 주식회사 Display apparatus and method of displaying three dimensional image using the same
KR101813035B1 (en) * 2011-10-10 2017-12-28 엘지전자 주식회사 Mobile terminal and method for controlling the same
WO2013055164A1 (en) * 2011-10-13 2013-04-18 삼성전자 주식회사 Method for displaying contents, method for synchronizing contents, and method and device for displaying broadcast contents
GB2495725B (en) * 2011-10-18 2014-10-01 Sony Comp Entertainment Europe Image transfer apparatus and method
JP5149435B1 (en) * 2011-11-04 2013-02-20 株式会社東芝 Video processing apparatus and video processing method
US8933935B2 (en) 2011-11-10 2015-01-13 7D Surgical Inc. Method of rendering and manipulating anatomical images on mobile computing device
KR101887058B1 (en) * 2011-11-11 2018-08-09 엘지전자 주식회사 A process for processing a three-dimensional image and a method for controlling electric power of the same
WO2013073428A1 (en) * 2011-11-15 2013-05-23 シャープ株式会社 Display device
US9942580B2 (en) * 2011-11-18 2018-04-10 AT&T Intellectual Property I, L.P. System and method for automatically selecting encoding/decoding for streaming media
US20130127841A1 (en) * 2011-11-18 2013-05-23 Samsung Electronics Co., Ltd. Three-dimensional (3d) image display method and apparatus for 3d imaging and displaying contents according to start or end of operation
US8660362B2 (en) * 2011-11-21 2014-02-25 Microsoft Corporation Combined depth filtering and super resolution
US10071687B2 (en) 2011-11-28 2018-09-11 Magna Electronics Inc. Vision system for vehicle
DE102011055967B4 (en) * 2011-12-02 2016-03-10 Seereal Technologies S.A. Measuring method and device for carrying out the measuring method
US9626798B2 (en) 2011-12-05 2017-04-18 At&T Intellectual Property I, L.P. System and method to digitally replace objects in images or video
CN103163650A (en) * 2011-12-08 2013-06-19 武汉天马微电子有限公司 Naked eye three-dimensional (3D) grating structure
US20130156090A1 (en) * 2011-12-14 2013-06-20 Ati Technologies Ulc Method and apparatus for enabling multiuser use
US9042266B2 (en) * 2011-12-21 2015-05-26 Kik Interactive, Inc. Methods and apparatus for initializing a network connection for an output device
KR20140018414A (en) * 2011-12-22 2014-02-12 텐센트 테크놀로지(센젠) 컴퍼니 리미티드 Browser based application program extension method and device
CN202995143U (en) * 2011-12-29 2013-06-12 三星电子株式会社 Glasses device and display device
EP2611176A3 (en) * 2011-12-29 2015-11-18 Samsung Electronics Co., Ltd. Display apparatus and controlling method thereof
US9392251B2 (en) 2011-12-29 2016-07-12 Samsung Electronics Co., Ltd. Display apparatus, glasses apparatus and method for controlling depth
TWI467235B (en) * 2012-02-06 2015-01-01 Innocom Tech Shenzhen Co Ltd Three-dimensional (3d) display and displaying method thereof
CN103294453B (en) * 2012-02-24 2017-02-22 华为技术有限公司 Image processing method and image processing device
US9324190B2 (en) 2012-02-24 2016-04-26 Matterport, Inc. Capturing and aligning three-dimensional scenes
US10848731B2 (en) 2012-02-24 2020-11-24 Matterport, Inc. Capturing and aligning panoramic image and depth data
US11094137B2 (en) 2012-02-24 2021-08-17 Matterport, Inc. Employing three-dimensional (3D) data predicted from two-dimensional (2D) images using neural networks for 3D modeling applications and other applications
KR20130098023A (en) * 2012-02-27 2013-09-04 한국전자통신연구원 Apparatus and method for displaying an image on 3-dimentional display based on multi-layer parallax barrier
JP5942477B2 (en) * 2012-02-29 2016-06-29 富士ゼロックス株式会社 Setting device and program
JP5762998B2 (en) * 2012-03-07 2015-08-12 株式会社ジャパンディスプレイ Display device and electronic device
JP6015743B2 (en) * 2012-03-07 2016-10-26 ソニー株式会社 Information processing apparatus, information processing method, and program
JP5779124B2 (en) * 2012-03-13 2015-09-16 株式会社ジャパンディスプレイ Display device and electronic device
JP5806150B2 (en) * 2012-03-13 2015-11-10 株式会社ジャパンディスプレイ Display device
CN102650741B (en) * 2012-03-16 2014-06-11 京东方科技集团股份有限公司 Light splitting device, manufacturing method thereof and 3D (Three-Dimensional) display device
US9280042B2 (en) * 2012-03-16 2016-03-08 City University Of Hong Kong Automatic switching of a multi-mode projector display screen for displaying three-dimensional and two-dimensional images
WO2013135203A1 (en) 2012-03-16 2013-09-19 Tencent Technology (Shenzhen) Company Limited Offline download method and system
US9733707B2 (en) 2012-03-22 2017-08-15 Honeywell International Inc. Touch screen display user interface and method for improving touch interface utility on the same employing a rules-based masking system
US20130265297A1 (en) * 2012-04-06 2013-10-10 Motorola Mobility, Inc. Display of a Corrected Browser Projection of a Visual Guide for Placing a Three Dimensional Object in a Browser
US9308439B2 (en) * 2012-04-10 2016-04-12 Bally Gaming, Inc. Controlling three-dimensional presentation of wagering game content
WO2013153418A1 (en) * 2012-04-12 2013-10-17 Sony Mobile Communications Ab Improved 3d image display system
KR101923150B1 (en) * 2012-04-16 2018-11-29 삼성디스플레이 주식회사 Display apparatus and method of displaying three dimensional image using the same
CN102645959A (en) * 2012-04-16 2012-08-22 上海颖杰计算机系统设备有限公司 3D (Three Dimensional) integrated computer
US20150062315A1 (en) * 2012-04-18 2015-03-05 The Regents Of The University Of California Simultaneous 2d and 3d images on a display
EP2653906B1 (en) 2012-04-20 2022-08-24 Dolby Laboratories Licensing Corporation A system for delivering stereoscopic images
CN103379362B (en) * 2012-04-24 2017-07-07 腾讯科技(深圳)有限公司 VOD method and system
US9707892B2 (en) * 2012-04-25 2017-07-18 Gentex Corporation Multi-focus optical system
US20130290867A1 (en) * 2012-04-27 2013-10-31 Litera Technologies, LLC Systems and Methods For Providing Dynamic and Interactive Viewing and Control of Applications
KR20130123599A (en) * 2012-05-03 2013-11-13 한국과학기술원 Speed dependent automatic dimming technique
CN103457960B (en) 2012-05-15 2018-03-09 腾讯科技(深圳)有限公司 Method and system for loading files in a web game
US10089537B2 (en) 2012-05-18 2018-10-02 Magna Electronics Inc. Vehicle vision system with front and rear camera integration
JP2015525370A (en) * 2012-06-01 2015-09-03 コーニンクレッカ フィリップス エヌ ヴェ Autostereoscopic display device and driving method
US9201270B2 (en) * 2012-06-01 2015-12-01 Leia Inc. Directional backlight with a modulation layer
US8570651B1 (en) * 2012-06-04 2013-10-29 Hae-Yong Choi Both side screen for combined use of 2D/3D images
US9159153B2 (en) 2012-06-05 2015-10-13 Apple Inc. Method, system and apparatus for providing visual feedback of a map view change
US9111380B2 (en) 2012-06-05 2015-08-18 Apple Inc. Rendering maps
US9886794B2 (en) 2012-06-05 2018-02-06 Apple Inc. Problem reporting in maps
US10176633B2 (en) 2012-06-05 2019-01-08 Apple Inc. Integrated mapping and navigation application
US9482296B2 (en) 2012-06-05 2016-11-01 Apple Inc. Rendering road signs during navigation
US9997069B2 (en) 2012-06-05 2018-06-12 Apple Inc. Context-aware voice guidance
US9418672B2 (en) 2012-06-05 2016-08-16 Apple Inc. Navigation application with adaptive instruction text
US9367959B2 (en) * 2012-06-05 2016-06-14 Apple Inc. Mapping application with 3D presentation
US8965696B2 (en) 2012-06-05 2015-02-24 Apple Inc. Providing navigation instructions while operating navigation application in background
US9230556B2 (en) 2012-06-05 2016-01-05 Apple Inc. Voice instructions during navigation
JP6046923B2 (en) * 2012-06-07 2016-12-21 キヤノン株式会社 Image coding apparatus, image coding method, and program
US9773338B2 (en) * 2012-06-08 2017-09-26 Lg Electronics Inc. Rendering method of 3D web-page and terminal using the same
US9829996B2 (en) * 2012-06-25 2017-11-28 Zspace, Inc. Operations in a three dimensional display system
US20140195983A1 (en) * 2012-06-30 2014-07-10 Yangzhou Du 3d graphical user interface
KR101649660B1 (en) * 2012-07-06 2016-08-19 엘지전자 주식회사 Terminal for increasing visual comfort sensation of 3d object and control method thereof
US20140022241A1 (en) * 2012-07-18 2014-01-23 Electronics And Telecommunications Research Institute Display apparatus and method based on symmetrically spb
US10353718B2 (en) * 2012-07-23 2019-07-16 Vmware, Inc. Providing access to a remote application via a web client
US9491784B2 (en) * 2012-07-31 2016-11-08 Apple Inc. Streaming common media content to multiple devices
US8959176B2 (en) 2012-07-31 2015-02-17 Apple Inc. Streaming common media content to multiple devices
US9786281B1 (en) * 2012-08-02 2017-10-10 Amazon Technologies, Inc. Household agent learning
CA2822217A1 (en) 2012-08-02 2014-02-02 Iwatchlife Inc. Method and system for anonymous video analytics processing
US9423871B2 (en) * 2012-08-07 2016-08-23 Honeywell International Inc. System and method for reducing the effects of inadvertent touch on a touch screen controller
KR101994295B1 (en) * 2012-08-08 2019-06-28 삼성전자주식회사 Terminal and method for generating live image in terminal
US9225972B2 (en) 2012-08-10 2015-12-29 Pixtronix, Inc. Three dimensional (3D) image generation using electromechanical display elements
US9198209B2 (en) 2012-08-21 2015-11-24 Cisco Technology, Inc. Providing integrated end-to-end architecture that includes quality of service transport for tunneled traffic
CN103631021B (en) * 2012-08-27 2016-06-15 群康科技(深圳)有限公司 3D display device and image display method thereof
TWI509289B (en) * 2012-08-27 2015-11-21 Innocom Tech Shenzhen Co Ltd Stereoscopic display apparatus and image display method thereof
KR20140028780A (en) * 2012-08-30 2014-03-10 삼성디스플레이 주식회사 Display apparatus and method of displaying three dimensional image using the same
US9811878B1 (en) * 2012-09-04 2017-11-07 Amazon Technologies, Inc. Dynamic processing of image borders
US10171540B2 (en) * 2012-09-07 2019-01-01 High Sec Labs Ltd Method and apparatus for streaming video security
CN104412610A (en) * 2012-09-14 2015-03-11 日立麦克赛尔株式会社 Video display device and terminal device
JP5837009B2 (en) * 2012-09-26 2015-12-24 キヤノン株式会社 Display device and control method thereof
CN104104934B (en) * 2012-10-04 2019-02-19 陈笛 Glasses-free multi-viewer three-dimensional display assembly and method
JP5928286B2 (en) * 2012-10-05 2016-06-01 富士ゼロックス株式会社 Information processing apparatus and program
US9798150B2 (en) * 2012-10-10 2017-10-24 Broadcast 3Dtv, Inc. System for distributing auto-stereoscopic images
US20140104242A1 (en) * 2012-10-12 2014-04-17 Nvidia Corporation System and method for concurrent display of a video signal on a plurality of display devices
CN102917265A (en) * 2012-10-25 2013-02-06 深圳创维-Rgb电子有限公司 Information browsing method and system based on network television
US9235103B2 (en) * 2012-10-25 2016-01-12 Au Optronics Corporation 3D liquid crystal display comprising four electrodes alternately arranged between a first and second substrate
TWI452345B (en) * 2012-10-26 2014-09-11 Au Optronics Corp Three dimensions display device and displaying method thereof
US9161018B2 (en) * 2012-10-26 2015-10-13 Christopher L. UHL Methods and systems for synthesizing stereoscopic images
JP2014092744A (en) * 2012-11-06 2014-05-19 Japan Display Inc Stereoscopic display device
CN102981343B (en) * 2012-11-21 2015-01-07 京东方科技集团股份有限公司 Convertible lens and preparation method thereof, as well as two-dimensional and three-dimensional display surface substrate and display device
CN104516168B (en) * 2012-11-21 2018-05-08 京东方科技集团股份有限公司 Convertible lens and preparation method thereof, 2 d-3 d display base plate and display device
US9674510B2 (en) * 2012-11-21 2017-06-06 Elwha Llc Pulsed projection system for 3D video
US9547937B2 (en) * 2012-11-30 2017-01-17 Legend3D, Inc. Three-dimensional annotation system and method
WO2014085910A1 (en) 2012-12-04 2014-06-12 Interaxon Inc. System and method for enhancing content using brain-state data
US9128580B2 (en) 2012-12-07 2015-09-08 Honeywell International Inc. System and method for interacting with a touch screen interface utilizing an intelligent stencil mask
US20140165209A1 (en) * 2012-12-11 2014-06-12 Verizon Patent And Licensing Inc. Digital content delivery platform for multiple retailers
US9497448B2 (en) * 2012-12-31 2016-11-15 Lg Display Co., Ltd. Image processing method of transparent display apparatus and apparatus thereof
TWI531213B (en) * 2013-01-18 2016-04-21 國立成功大學 Image conversion method and module for naked-eye 3d display
US9305365B2 (en) 2013-01-24 2016-04-05 Kineticor, Inc. Systems, devices, and methods for tracking moving targets
US9717461B2 (en) 2013-01-24 2017-08-01 Kineticor, Inc. Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan
US10327708B2 (en) 2013-01-24 2019-06-25 Kineticor, Inc. Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan
WO2014120734A1 (en) 2013-02-01 2014-08-07 Kineticor, Inc. Motion tracking system for real time adaptive motion compensation in biomedical imaging
WO2014129134A1 (en) * 2013-02-19 2014-08-28 パナソニック株式会社 Image display device
TWI502247B (en) * 2013-02-26 2015-10-01 Chunghwa Picture Tubes Ltd Autostereoscopic display device and display method thereof
US8712217B1 (en) 2013-03-01 2014-04-29 Comcast Cable Communications, Llc Methods and systems for time-shifting content
US20140267601A1 (en) * 2013-03-14 2014-09-18 Corel Corporation System and method for efficient editing of 3d video
US20140268324A1 (en) * 2013-03-18 2014-09-18 3-D Virtual Lens Technologies, Llc Method of displaying 3d images from 2d source images using a barrier grid
CN103236074B (en) * 2013-03-25 2015-12-23 深圳超多维光电子有限公司 A kind of 2D/3D image processing method and device
US10110647B2 (en) * 2013-03-28 2018-10-23 Qualcomm Incorporated Method and apparatus for altering bandwidth consumption
KR101981530B1 (en) 2013-03-29 2019-05-23 엘지디스플레이 주식회사 Stereoscopic image display device and method for driving the same
CN103235415B (en) * 2013-04-01 2015-12-23 昆山龙腾光电有限公司 Grating-based multi-view autostereoscopic display
KR101970577B1 (en) * 2013-04-09 2019-04-19 엘지디스플레이 주식회사 Stereoscopic display device and eye-tracking method thereof
US20140316907A1 (en) * 2013-04-17 2014-10-23 Asaf NAIM Multilayered user interface for internet browser
CN103293689B (en) * 2013-05-31 2015-05-13 京东方科技集团股份有限公司 Method capable of switching between different display modes and display device
KR20140142863A (en) * 2013-06-05 2014-12-15 한국전자통신연구원 Apparatus and method for providing graphic editors
TWI510813B (en) * 2013-06-18 2015-12-01 Zhangjiagang Kangde Xin Optronics Material Co Ltd A liquid crystal parallax barrier device that displays three-dimensional images in both directions
CN104238185B (en) * 2013-06-19 2017-04-12 扬升照明股份有限公司 Light source module, display device and light source module drive method
CN103309639A (en) * 2013-06-21 2013-09-18 广东威创视讯科技股份有限公司 Method and device based on split screen display of three-dimensional scene
US10003789B2 (en) 2013-06-24 2018-06-19 The Regents Of The University Of California Practical two-frame 3D+2D TV
CN103365657B (en) * 2013-06-28 2019-03-15 北京智谷睿拓技术服务有限公司 Display control method and device, and display device including the same
TWI495904B (en) * 2013-07-12 2015-08-11 Vision Technology Co Ltd C Field sequential color lcd and method for generating 3d images by matching a software optical grating
US9418469B1 (en) 2013-07-19 2016-08-16 Outward, Inc. Generating video content
JP2015025968A (en) * 2013-07-26 2015-02-05 ソニー株式会社 Presentation medium and display device
US9678929B2 (en) * 2013-08-01 2017-06-13 Equldo Limited Stereoscopic online web content creation and rendering
TWI489148B (en) * 2013-08-23 2015-06-21 Au Optronics Corp Stereoscopic display and the driving method
TWI505243B (en) * 2013-09-10 2015-10-21 Zhangjiagang Kangde Xin Optronics Material Co Ltd A device that can display 2D and 3D images at the same time
KR101856568B1 (en) * 2013-09-16 2018-06-19 삼성전자주식회사 Multi view image display apparatus and controlling method thereof
US10592064B2 (en) * 2013-09-17 2020-03-17 Amazon Technologies, Inc. Approaches for three-dimensional object display used in content navigation
US10067634B2 (en) 2013-09-17 2018-09-04 Amazon Technologies, Inc. Approaches for three-dimensional object display
US9392355B1 (en) * 2013-09-19 2016-07-12 Voyetra Turtle Beach, Inc. Gaming headset with voice scrambling for private in-game conversations
US20160255322A1 (en) * 2013-10-07 2016-09-01 Vid Scale, Inc. User adaptive 3d video rendering and delivery
CN103508999B (en) * 2013-10-12 2015-05-13 浙江海正药业股份有限公司 Maxacalcitol synthesizing intermediate and preparation method and application thereof
US9986228B2 (en) 2016-03-24 2018-05-29 3Di Llc Trackable glasses system that provides multiple views of a shared display
US9883173B2 (en) 2013-12-25 2018-01-30 3Di Llc Stereoscopic display
US10116914B2 (en) * 2013-10-31 2018-10-30 3Di Llc Stereoscopic display
US10652525B2 (en) 2013-10-31 2020-05-12 3Di Llc Quad view display system
US11343487B2 (en) 2013-10-31 2022-05-24 David Woods Trackable glasses system for perspective views of a display
JP6411862B2 (en) * 2013-11-15 2018-10-24 パナソニック株式会社 File generation method and file generation apparatus
KR20150057064A (en) * 2013-11-18 2015-05-28 엘지전자 주식회사 Electronic device and control method thereof
US20150138184A1 (en) * 2013-11-20 2015-05-21 Apple Inc. Spatially interactive computing device
TWI511112B (en) * 2013-11-27 2015-12-01 Acer Inc Image display method and display system
CN103605211B (en) * 2013-11-27 2016-04-20 南京大学 Tablet non-auxiliary stereo display device and method
KR20150065056A (en) * 2013-12-04 2015-06-12 삼성디스플레이 주식회사 Image display apparatus
US9988047B2 (en) 2013-12-12 2018-06-05 Magna Electronics Inc. Vehicle control system with traffic driving control
US20150189256A1 (en) * 2013-12-16 2015-07-02 Christian Stroetmann Autostereoscopic multi-layer display and control approaches
CN103676302B (en) 2013-12-31 2016-04-06 京东方科技集团股份有限公司 Array substrate, display device and method for realizing 2D/3D display switching
US10409079B2 (en) 2014-01-06 2019-09-10 Avegant Corp. Apparatus, system, and method for displaying an image using a plate
US10303242B2 (en) 2014-01-06 2019-05-28 Avegant Corp. Media chair apparatus, system, and method
JP6467680B2 (en) * 2014-01-10 2019-02-13 パナソニックIpマネジメント株式会社 File generation method and file generation apparatus
CA2937702C (en) * 2014-01-22 2022-06-21 AI Squared Emphasizing a portion of the visible content elements of a markup language document
US10291907B2 (en) 2014-01-23 2019-05-14 Telefonaktiebolaget Lm Ericsson (Publ) Multi-view display control for channel selection
US9182605B2 (en) * 2014-01-29 2015-11-10 Emine Goulanian Front-projection autostereoscopic 3D display system
US10375365B2 (en) 2014-02-07 2019-08-06 Samsung Electronics Co., Ltd. Projection system with enhanced color and contrast
US10565925B2 (en) 2014-02-07 2020-02-18 Samsung Electronics Co., Ltd. Full color display with intrinsic transparency
US10453371B2 (en) 2014-02-07 2019-10-22 Samsung Electronics Co., Ltd. Multi-layer display with color and contrast enhancement
US10554962B2 (en) 2014-02-07 2020-02-04 Samsung Electronics Co., Ltd. Multi-layer high transparency display for light field generation
CN103792672B (en) 2014-02-14 2016-03-23 成都京东方光电科技有限公司 Stereo display assembly, liquid crystal panel and display device
CN104853008B (en) * 2014-02-17 2020-05-19 北京三星通信技术研究有限公司 Portable device and method capable of switching between two-dimensional display and three-dimensional display
KR101678389B1 (en) * 2014-02-28 2016-11-22 엔트릭스 주식회사 Method for providing media data based on cloud computing, apparatus and system
CN103903548B (en) * 2014-03-07 2016-03-02 京东方科技集团股份有限公司 A kind of driving method of display panel and drive system
US20150253974A1 (en) 2014-03-07 2015-09-10 Sony Corporation Control of large screen display using wireless portable computer interfacing with display controller
WO2015148391A1 (en) 2014-03-24 2015-10-01 Thomas Michael Ernst Systems, methods, and devices for removing prospective motion correction from medical imaging scans
US9373306B2 (en) * 2014-03-25 2016-06-21 Intel Corporation Direct viewer projection
KR102175813B1 (en) * 2014-04-18 2020-11-09 삼성디스플레이 주식회사 Three dimensional image display device and method of processing image
US20150334367A1 (en) * 2014-05-13 2015-11-19 Nagravision S.A. Techniques for displaying three dimensional objects
US9838756B2 (en) * 2014-05-20 2017-12-05 Electronics And Telecommunications Research Institute Method and apparatus for providing three-dimensional territorial broadcasting based on non real time service
KR102204830B1 (en) * 2014-05-20 2021-01-19 한국전자통신연구원 Method and apparatus for providing three-dimensional territorial brordcasting based on non real time service
CN104023223B (en) * 2014-05-29 2016-03-02 京东方科技集团股份有限公司 Display control method, apparatus and system
CN104090365A (en) * 2014-06-18 2014-10-08 京东方科技集团股份有限公司 Shutter glasses, display device, display system and display method
US10613585B2 (en) * 2014-06-19 2020-04-07 Samsung Electronics Co., Ltd. Transparent display apparatus, group play system using transparent display apparatus and performance methods thereof
GB2527548A (en) * 2014-06-25 2015-12-30 Sharp Kk Variable barrier pitch correction
CN104155769A (en) * 2014-07-15 2014-11-19 深圳市亿思达显示科技有限公司 2D/3D co-fusion display device and advertising device
CN104090818A (en) * 2014-07-16 2014-10-08 北京智谷睿拓技术服务有限公司 Information processing method, device and system
TWI556624B (en) * 2014-07-18 2016-11-01 友達光電股份有限公司 Image displaying method and image dispaly device
CN104252058B (en) * 2014-07-18 2017-06-20 京东方科技集团股份有限公司 Grating control method and device, grating, display panel and 3D display devices
WO2016014718A1 (en) 2014-07-23 2016-01-28 Kineticor, Inc. Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan
JP6064084B2 (en) * 2014-07-30 2017-01-18 オリンパス株式会社 Image processing device
WO2016021861A1 (en) * 2014-08-02 2016-02-11 Samsung Electronics Co., Ltd. Electronic device and user interaction method thereof
KR102366677B1 (en) * 2014-08-02 2022-02-23 삼성전자주식회사 Apparatus and Method for User Interaction thereof
CN105323654B (en) * 2014-08-05 2019-02-15 优视科技有限公司 Method and apparatus for presenting content data from a network
JP6327062B2 (en) * 2014-08-25 2018-05-23 オムロン株式会社 Display device
US9925980B2 (en) 2014-09-17 2018-03-27 Magna Electronics Inc. Vehicle collision avoidance system with enhanced pedestrian avoidance
US10750153B2 (en) 2014-09-22 2020-08-18 Samsung Electronics Company, Ltd. Camera system for three-dimensional video
US11205305B2 (en) 2014-09-22 2021-12-21 Samsung Electronics Company, Ltd. Presentation of three-dimensional video
WO2016046068A1 (en) 2014-09-25 2016-03-31 Koninklijke Philips N.V. Display device with directional control of the output, and a backlight for such a display device
FR3026589A1 (en) * 2014-09-30 2016-04-01 Orange Method and device for adapting the display of a video stream by a client
FR3026852B1 (en) * 2014-10-03 2016-12-02 Thales Sa Semi-transparent screen display system shared by two observers
US10506295B2 (en) * 2014-10-09 2019-12-10 Disney Enterprises, Inc. Systems and methods for delivering secondary content to viewers
KR102266064B1 (en) * 2014-10-15 2021-06-18 삼성디스플레이 주식회사 Method of driving display panel, display panel driving apparatus and display apparatus having the display panel driving apparatus
US20160119685A1 (en) * 2014-10-21 2016-04-28 Samsung Electronics Co., Ltd. Display method and display device
CN104361622B (en) * 2014-10-31 2018-06-19 福建星网视易信息系统有限公司 Interface drawing method and device
DE102014225796A1 (en) * 2014-12-15 2016-06-16 Bayerische Motoren Werke Aktiengesellschaft Method for controlling a vehicle system
CN104461440B (en) * 2014-12-31 2018-01-02 上海天马有机发光显示技术有限公司 Rendering method, rendering device and display device
CN107209406B (en) 2015-01-10 2021-07-27 镭亚股份有限公司 Two-dimensional/three-dimensional (2D/3D) switchable display backlight and electronic display
WO2016111709A1 (en) 2015-01-10 2016-07-14 Leia Inc. Diffraction grating-based backlighting having controlled diffractive coupling efficiency
KR102239155B1 (en) 2015-01-10 2021-04-12 레이아 인코포레이티드 Polarization-mixing light guide and multipbeam grating-based backlighting using same
EP3248058B1 (en) 2015-01-19 2020-05-06 LEIA Inc. Unidirectional grating-based backlighting employing a reflective island
KR20160089600A (en) * 2015-01-19 2016-07-28 삼성디스플레이 주식회사 Display device
US9690110B2 (en) * 2015-01-21 2017-06-27 Apple Inc. Fine-coarse autostereoscopic display
EP3250960B1 (en) * 2015-01-28 2023-06-07 LEIA Inc. Three-dimensional (3d) electronic display
US20160227156A1 (en) * 2015-02-02 2016-08-04 Hisense Hiview Tech Co., Ltd. Modular television system
JP6359990B2 (en) * 2015-02-24 2018-07-18 株式会社ジャパンディスプレイ Display device and display method
JP6359989B2 (en) * 2015-02-24 2018-07-18 株式会社ジャパンディスプレイ Display device and display method
TWI554788B (en) * 2015-03-04 2016-10-21 友達光電股份有限公司 Display device
KR102321364B1 (en) * 2015-03-05 2021-11-03 삼성전자주식회사 Method for synthesizing a 3d background content and device thereof
PT3271761T (en) 2015-03-16 2021-06-25 Leia Inc Unidirectional grating-based backlighting employing an angularly selective reflective layer
JP6411257B2 (en) * 2015-03-19 2018-10-24 株式会社ジャパンディスプレイ Display device and control method thereof
US9823474B2 (en) 2015-04-02 2017-11-21 Avegant Corp. System, apparatus, and method for displaying an image with a wider field of view
US9995857B2 (en) 2015-04-03 2018-06-12 Avegant Corp. System, apparatus, and method for displaying an image using focal modulation
US9846309B2 (en) * 2015-04-17 2017-12-19 Dongseo University Technology Headquarters Depth-priority integral imaging display method using nonuniform dynamic mask array
JP6961491B2 (en) 2015-04-23 2021-11-05 Leia Inc. Double light-guided grid-based backlight and electronic display with the same backlight
US10360617B2 (en) 2015-04-24 2019-07-23 Walmart Apollo, Llc Automated shopping apparatus and method in response to consumption
US9705936B2 (en) 2015-04-24 2017-07-11 Mersive Technologies, Inc. System and method for interactive and real-time visualization of distributed media
JP2018517242A (en) 2015-05-09 2018-06-28 Leia Inc. Color scanning grid based backlight and electronic display using the backlight
CN104834104B (en) * 2015-05-25 2017-05-24 京东方科技集团股份有限公司 2D/3D switchable display panel, and display method and display device thereof
ES2819239T3 (en) 2015-05-30 2021-04-15 Leia Inc Vehicle display system
US10904091B2 (en) 2015-06-03 2021-01-26 Avago Technologies International Sales Pte. Limited System for network-based reallocation of functions
CN104883559A (en) * 2015-06-06 2015-09-02 深圳市虚拟现实科技有限公司 Video playing method and video playing device
CN104851394B (en) * 2015-06-10 2017-11-28 京东方科技集团股份有限公司 A kind of display device and display methods
CN104849870B (en) * 2015-06-12 2018-01-09 京东方科技集团股份有限公司 Display panel and display device
US10362342B2 (en) * 2015-06-16 2019-07-23 Lg Electronics Inc. Broadcast signal transmission device, broadcast signal reception device, broadcast signal transmission method, and broadcast signal reception method
US9846310B2 (en) * 2015-06-22 2017-12-19 Innolux Corporation 3D image display device with improved depth ranges
GB2540376A (en) * 2015-07-14 2017-01-18 Sharp Kk Parallax barrier with independently controllable regions
GB2540377A (en) 2015-07-14 2017-01-18 Sharp Kk Parallax barrier with independently controllable regions
FR3038995B1 (en) * 2015-07-15 2018-05-11 F4 Interactive device with customizable display
US10349488B2 (en) 2015-07-17 2019-07-09 Abl Ip Holding Llc Software configurable lighting device
WO2017015056A1 (en) * 2015-07-17 2017-01-26 Abl Ip Holding Llc Arrangements for software configurable lighting device
EP3325401A1 (en) 2015-07-17 2018-05-30 ABL IP Holding LLC Systems and methods to provide configuration data to a software configurable lighting device
US9943247B2 (en) 2015-07-28 2018-04-17 The University Of Hawai'i Systems, devices, and methods for detecting false movements for motion correction during a medical imaging scan
US10079000B2 (en) 2015-08-12 2018-09-18 Microsoft Technology Licensing, Llc Reducing display degradation
CN105100783B (en) 2015-08-19 2018-03-23 京东方科技集团股份有限公司 3D display device and 3D display method
US10186188B2 (en) * 2015-09-23 2019-01-22 Motorola Solutions, Inc. Multi-angle simultaneous view light-emitting diode display
EP3148188A1 (en) * 2015-09-24 2017-03-29 Airbus Operations GmbH Virtual windows for airborne vehicles
FR3042620B1 (en) 2015-10-16 2017-12-08 F4 Interactive web device with customizable display
CN106254845B (en) * 2015-10-20 2017-08-25 深圳超多维光电子有限公司 Naked-eye stereoscopic display method, device and electronic equipment
CN105306866A (en) * 2015-10-27 2016-02-03 青岛海信电器股份有限公司 Frame rate conversion method and device
TWI708099B (en) * 2015-11-10 2020-10-21 荷蘭商皇家飛利浦有限公司 Display device and display control method
EP3374231A4 (en) * 2015-11-13 2019-05-29 Harman International Industries, Incorporated User interface for in-vehicle system
US20170148488A1 (en) * 2015-11-20 2017-05-25 Mediatek Inc. Video data processing system and associated method for analyzing and summarizing recorded video data
US10144419B2 (en) 2015-11-23 2018-12-04 Magna Electronics Inc. Vehicle dynamic control system for emergency handling
WO2017091479A1 (en) 2015-11-23 2017-06-01 Kineticor, Inc. Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan
US9711128B2 (en) * 2015-12-04 2017-07-18 Opentv, Inc. Combined audio for multiple content presentation
GB2562430A (en) 2016-01-19 2018-11-14 Walmart Apollo Llc Consumable item ordering system
IL260825B (en) * 2016-01-29 2022-06-01 Magic Leap Inc Display for three-dimensional image
US10373544B1 (en) 2016-01-29 2019-08-06 Leia, Inc. Transformation from tiled to composite images
US10063917B2 (en) 2016-03-16 2018-08-28 Sorenson Media Inc. Fingerprint layouts for content fingerprinting
US10691880B2 (en) * 2016-03-29 2020-06-23 Microsoft Technology Licensing, Llc Ink in an electronic document
US10200428B1 (en) * 2016-03-30 2019-02-05 Amazon Technologies, Inc. Unicast routing of a media stream to subscribers
US10185787B1 (en) * 2016-04-06 2019-01-22 Bentley Systems, Incorporated Tool for accurate onsite model visualization that facilitates environment interaction
US10256277B2 (en) * 2016-04-11 2019-04-09 Abl Ip Holding Llc Luminaire utilizing a transparent organic light emitting device display
WO2017188955A1 (en) * 2016-04-28 2017-11-02 Hewlett-Packard Development Company, L.P. Digital display devices
US10353534B2 (en) 2016-05-13 2019-07-16 Sap Se Overview page in multi application user interface
US10579238B2 (en) 2016-05-13 2020-03-03 Sap Se Flexible screen layout across multiple platforms
TWI626475B (en) * 2016-06-08 2018-06-11 國立交通大學 Stereoscopic display screen and stereoscopic display system
CN105842865B (en) * 2016-06-21 2018-01-30 成都工业学院 A kind of slim grating 3D display device based on slit grating
CN106257321B (en) * 2016-06-28 2021-11-30 京东方科技集团股份有限公司 3D head-up display system and method
US20180035236A1 (en) * 2016-07-28 2018-02-01 Leonardo Basterra Audio System with Binaural Elements and Method of Use with Perspective Switching
US10154253B2 (en) * 2016-08-29 2018-12-11 Disney Enterprises, Inc. Multi-view displays using images encoded with orbital angular momentum (OAM) on a pixel or image basis
WO2018044711A1 (en) * 2016-08-31 2018-03-08 Wal-Mart Stores, Inc. Systems and methods of enabling retail shopping while disabling components based on location
US10271043B2 (en) * 2016-11-18 2019-04-23 Zspace, Inc. 3D user interface—360-degree visualization of 2D webpage content
US11003305B2 (en) * 2016-11-18 2021-05-11 Zspace, Inc. 3D user interface
US10127715B2 (en) * 2016-11-18 2018-11-13 Zspace, Inc. 3D user interface—non-native stereoscopic image conversion
US10621898B2 (en) * 2016-11-23 2020-04-14 Pure Depth Limited Multi-layer display system for vehicle dash or the like
US10170060B2 (en) * 2016-12-27 2019-01-01 Facebook Technologies, Llc Interlaced liquid crystal display panel and backlight used in a head mounted display
US10965967B2 (en) 2016-12-31 2021-03-30 Turner Broadcasting System, Inc. Publishing a disparate per-client live media output stream based on dynamic insertion of targeted non-programming content and customized programming content
US11051061B2 (en) 2016-12-31 2021-06-29 Turner Broadcasting System, Inc. Publishing a disparate live media output stream using pre-encoded media assets
US11051074B2 (en) 2016-12-31 2021-06-29 Turner Broadcasting System, Inc. Publishing disparate live media output streams using live input streams
US10992973B2 (en) 2016-12-31 2021-04-27 Turner Broadcasting System, Inc. Publishing a plurality of disparate live media output stream manifests using live input streams and pre-encoded media assets
US11134309B2 (en) 2016-12-31 2021-09-28 Turner Broadcasting System, Inc. Creation of channels using pre-encoded media assets
US10425700B2 (en) 2016-12-31 2019-09-24 Turner Broadcasting System, Inc. Dynamic scheduling and channel creation based on real-time or near-real-time content context analysis
US11503352B2 (en) 2016-12-31 2022-11-15 Turner Broadcasting System, Inc. Dynamic scheduling and channel creation based on external data
US10645462B2 (en) 2016-12-31 2020-05-05 Turner Broadcasting System, Inc. Dynamic channel versioning in a broadcast air chain
US10856016B2 (en) 2016-12-31 2020-12-01 Turner Broadcasting System, Inc. Publishing disparate live media output streams in mixed mode based on user selection
US11109086B2 (en) 2016-12-31 2021-08-31 Turner Broadcasting System, Inc. Publishing disparate live media output streams in mixed mode
US10075753B2 (en) 2016-12-31 2018-09-11 Turner Broadcasting System, Inc. Dynamic scheduling and channel creation based on user selection
US10694231B2 (en) 2016-12-31 2020-06-23 Turner Broadcasting System, Inc. Dynamic channel versioning in a broadcast air chain based on user preferences
US11038932B2 (en) 2016-12-31 2021-06-15 Turner Broadcasting System, Inc. System for establishing a shared media session for one or more client devices
CN108287679A (en) * 2017-01-10 2018-07-17 中兴通讯股份有限公司 A kind of display characteristic parameter adjusting method and terminal
CN106710531B (en) * 2017-01-19 2019-11-05 深圳市华星光电技术有限公司 Backlight control circuit and electronic device
US11044464B2 (en) * 2017-02-09 2021-06-22 Fyusion, Inc. Dynamic content modification of image and video based multi-view interactive digital media representations
US10650416B1 (en) * 2017-02-17 2020-05-12 Sprint Communications Company L.P. Live production interface and response testing
US10210833B2 (en) * 2017-03-31 2019-02-19 Panasonic Liquid Crystal Display Co., Ltd. Display device
US10078135B1 (en) * 2017-04-25 2018-09-18 Intel Corporation Identifying a physical distance using audio channels
KR102335725B1 (en) 2017-05-14 2021-12-07 레이아 인코포레이티드 Multiview backlights, displays, and methods using active emitters
US10375375B2 (en) 2017-05-15 2019-08-06 Lg Electronics Inc. Method of providing fixed region information or offset region information for subtitle in virtual reality system and device for controlling the same
FR3066672B1 (en) * 2017-05-19 2020-05-22 Sagemcom Broadband Sas Method for communicating an immersive video
US10939169B2 (en) 2017-05-25 2021-03-02 Turner Broadcasting System, Inc. Concurrent presentation of non-programming media assets with programming media content at client device
IL270856B2 (en) 2017-05-30 2023-12-01 Magic Leap Inc Power supply assembly with fan assembly for electronic device
CN110785741A (en) * 2017-06-16 2020-02-11 微软技术许可有限责任公司 Generating user interface containers
CN107146573B (en) * 2017-06-26 2020-05-01 上海天马有机发光显示技术有限公司 Display panel, display method thereof and display device
US20190026004A1 (en) * 2017-07-18 2019-01-24 Chicago Labs, LLC Three Dimensional Icons for Computer Applications
EP3658778A4 (en) 2017-07-28 2021-04-14 Magic Leap, Inc. Fan assembly for displaying an image
CN107396087B (en) * 2017-07-31 2019-03-12 京东方科技集团股份有限公司 Naked eye three-dimensional display device and its control method
US10692279B2 (en) * 2017-07-31 2020-06-23 Quantum Spatial, Inc. Systems and methods for facilitating making partial selections of multidimensional information while maintaining a multidimensional structure
US11049218B2 (en) 2017-08-11 2021-06-29 Samsung Electronics Company, Ltd. Seamless image stitching
US10515397B2 (en) * 2017-09-08 2019-12-24 Uptown Network LLC System and method for facilitating virtual gift giving
CN108205411A (en) * 2017-09-30 2018-06-26 中兴通讯股份有限公司 Display changeover method and device, terminal
CN107707901B (en) * 2017-09-30 2019-10-25 深圳超多维科技有限公司 Display method, device and equipment for a naked-eye 3D display screen
US10212532B1 (en) 2017-12-13 2019-02-19 At&T Intellectual Property I, L.P. Immersive media with media device
US11132842B2 (en) * 2017-12-22 2021-09-28 Unity IPR ApS Method and system for synchronizing a plurality of augmented reality devices to a virtual reality device
JP2019154008A (en) * 2018-03-06 2019-09-12 シャープ株式会社 Stereoscopic image display device, method for displaying liquid crystal display, and program for liquid crystal display
CN108469682A (en) * 2018-03-30 2018-08-31 京东方科技集团股份有限公司 Three-dimensional display apparatus and 3D display method thereof
CN108490703B (en) * 2018-04-03 2021-10-15 京东方科技集团股份有限公司 Display system and display control method thereof
US11025892B1 (en) 2018-04-04 2021-06-01 James Andrew Aman System and method for simultaneously providing public and private images
US10523922B2 (en) 2018-04-06 2019-12-31 Zspace, Inc. Identifying replacement 3D images for 2D images via ranking criteria
US10523921B2 (en) 2018-04-06 2019-12-31 Zspace, Inc. Replacing 2D images with 3D images
US10999573B2 (en) * 2018-04-25 2021-05-04 Raxium, Inc. Partial light field display architecture
US11513405B2 (en) * 2018-04-26 2022-11-29 Semiconductor Energy Laboratory Co., Ltd. Display device and electronic device
US10600246B2 (en) * 2018-06-15 2020-03-24 Microsoft Technology Licensing, Llc Pinning virtual reality passthrough regions to real-world locations
KR102506873B1 (en) * 2018-07-18 2023-03-08 현대자동차주식회사 Vehicle cluster having a three-dimensional effect, system having the same and method providing a three-dimensional scene thereof
WO2020022288A1 (en) * 2018-07-27 2020-01-30 京セラ株式会社 Display device and mobile body
US10887574B2 (en) 2018-07-31 2021-01-05 Intel Corporation Selective packing of patches for immersive video
US11212506B2 (en) 2018-07-31 2021-12-28 Intel Corporation Reduced rendering of six-degree of freedom video
US10762394B2 (en) 2018-07-31 2020-09-01 Intel Corporation System and method for 3D blob classification and transmission
US10893299B2 (en) 2018-07-31 2021-01-12 Intel Corporation Surface normal vector processing mechanism
US11178373B2 (en) 2018-07-31 2021-11-16 Intel Corporation Adaptive resolution of point cloud and viewpoint prediction for video streaming in computing environments
US10757324B2 (en) 2018-08-03 2020-08-25 Semiconductor Components Industries, Llc Transform processors for gradually switching between image transforms
US11057631B2 (en) 2018-10-10 2021-07-06 Intel Corporation Point cloud coding standard conformance definition in computing environments
US11727859B2 (en) 2018-10-25 2023-08-15 Boe Technology Group Co., Ltd. Display panel and display device
CN109192136B (en) * 2018-10-25 2020-12-22 京东方科技集团股份有限公司 Display substrate, light field display device and driving method thereof
US10880534B2 (en) * 2018-11-09 2020-12-29 Korea Electronics Technology Institute Electronic device and method for tiled video multi-channel playback
KR102023905B1 (en) * 2018-11-09 2019-11-04 Korea Electronics Technology Institute Electronic device and method for multi-channel reproduction of tiled image
US10699673B2 (en) * 2018-11-19 2020-06-30 Facebook Technologies, Llc Apparatus, systems, and methods for local dimming in brightness-controlled environments
CN109598254B (en) * 2018-12-17 2019-11-26 Hainan University Group-oriented combined optimization method for spatial representation
US10880606B2 (en) 2018-12-21 2020-12-29 Turner Broadcasting System, Inc. Disparate live media output stream playout and broadcast distribution
US11082734B2 (en) 2018-12-21 2021-08-03 Turner Broadcasting System, Inc. Publishing a disparate live media output stream that complies with distribution format regulations
US10873774B2 (en) 2018-12-22 2020-12-22 Turner Broadcasting System, Inc. Publishing a disparate live media output stream manifest that includes one or more media segments corresponding to key events
CN109725819B (en) * 2018-12-25 2022-12-13 Zhejiang Jiuxuan Intelligent Information Technology Co., Ltd. Interface display method and device, double-screen double-system terminal and readable storage medium
US10854171B2 (en) 2018-12-31 2020-12-01 Samsung Electronics Co., Ltd. Multi-user personal display system and applications thereof
EP3687166A1 (en) * 2019-01-23 2020-07-29 Ultra-D Coöperatief U.A. Interoperable 3d image content handling
CN109686303B (en) * 2019-01-28 2021-09-17 Xiamen Tianma Micro-Electronics Co., Ltd. Organic light-emitting display panel, organic light-emitting display device and compensation method
JP7317517B2 (en) * 2019-02-12 2023-07-31 Japan Display Inc. Display device
CN110007475A (en) * 2019-04-17 2019-07-12 Wanwei Yunshi (Shanghai) Digital Technology Co., Ltd. Method and apparatus for compensating eyesight using virtual depth
US10571744B1 (en) 2019-04-18 2020-02-25 Apple Inc. Displays with adjustable direct-lit backlight units and power consumption compensation
US10964275B2 (en) 2019-04-18 2021-03-30 Apple Inc. Displays with adjustable direct-lit backlight units and adaptive processing
US10504453B1 (en) 2019-04-18 2019-12-10 Apple Inc. Displays with adjustable direct-lit backlight units
CN110262051B (en) * 2019-07-26 2023-12-29 Chengdu Technological University Retroreflective stereoscopic display device based on directional light source
EP3779612A1 (en) * 2019-08-16 2021-02-17 The Swatch Group Research and Development Ltd Method for broadcasting a message to the wearer of a watch
CN112394845B (en) * 2019-08-19 2024-03-01 Beijing Xiaomi Mobile Software Co., Ltd. Distance sensor module, display device, electronic equipment and distance detection method
US11335095B1 (en) * 2019-08-27 2022-05-17 Gopro, Inc. Systems and methods for characterizing visual content
WO2021045733A1 (en) * 2019-09-03 2021-03-11 Light Field Lab, Inc. Light field display system for gaming environments
CN111415629B (en) * 2020-04-28 2022-02-22 TCL China Star Optoelectronics Technology Co., Ltd. Display device driving method and display device
US11750795B2 (en) 2020-05-12 2023-09-05 Apple Inc. Displays with viewer tracking
CN112505942B (en) * 2021-02-03 2021-04-20 Chengdu Technological University Multi-resolution stereoscopic display device based on rear projection light source
CN113992885B (en) * 2021-09-22 2023-03-21 Lenovo (Beijing) Co., Ltd. Data synchronization method and device
NL2030325B1 (en) * 2021-12-28 2023-07-03 Dimenco Holding B V Scaling of three-dimensional content for an autostereoscopic display device
KR20230112485A (en) * 2022-01-20 2023-07-27 LG Electronics Inc. Display device and operating method thereof
CN114936002A (en) * 2022-06-10 2022-08-23 Banma Network Technology Co., Ltd. Interface display method and device, and vehicle

Citations (97)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4829365A (en) * 1986-03-07 1989-05-09 Dimension Technologies, Inc. Autostereoscopic display with illuminating lines, light valve and mask
US5615046A (en) * 1995-01-23 1997-03-25 Cyber Scientific Inc. Stereoscopic viewing system
US5855425A (en) * 1996-07-19 1999-01-05 Sanyo Electric Co., Ltd. Stereoscopic display
US5945965A (en) * 1995-06-29 1999-08-31 Canon Kabushiki Kaisha Stereoscopic image display method
US5959597A (en) * 1995-09-28 1999-09-28 Sony Corporation Image/audio reproducing system
US5969850A (en) * 1996-09-27 1999-10-19 Sharp Kabushiki Kaisha Spatial light modulator, directional display and directional light source
US5990975A (en) * 1996-11-22 1999-11-23 Acer Peripherals, Inc. Dual screen displaying device
US6023277A (en) * 1996-07-03 2000-02-08 Canon Kabushiki Kaisha Display control apparatus and method
US6049424A (en) * 1995-11-15 2000-04-11 Sanyo Electric Co., Ltd. Three dimensional display device
US6094216A (en) * 1995-05-22 2000-07-25 Canon Kabushiki Kaisha Stereoscopic image display method, and stereoscopic image display apparatus using the method
US6144375A (en) * 1998-08-14 2000-11-07 Praja Inc. Multi-perspective viewer for content-based interactivity
US6188442B1 (en) * 1997-08-01 2001-02-13 International Business Machines Corporation Multiviewer display system for television monitors
US6285368B1 (en) * 1997-02-10 2001-09-04 Canon Kabushiki Kaisha Image display system and image display apparatus and information processing apparatus in the system
US20020010798A1 (en) * 2000-04-20 2002-01-24 Israel Ben-Shaul Differentiated content and application delivery via internet
US20020037037A1 (en) * 2000-09-22 2002-03-28 Philips Electronics North America Corporation Preferred transmission/streaming order of fine-granular scalability
US20020167862A1 (en) * 2001-04-03 2002-11-14 Carlo Tomasi Method and apparatus for approximating a source position of a sound-causing event for determining an input used in operating an electronic device
US20020171666A1 (en) * 1999-02-19 2002-11-21 Takaaki Endo Image processing apparatus for interpolating and generating images from an arbitrary view point
US20030103165A1 (en) * 2000-05-19 2003-06-05 Werner Bullinger System for operating a consumer electronics appliance
US20030137506A1 (en) * 2001-11-30 2003-07-24 Daniel Efran Image-based rendering for 3D viewing
US20030154261A1 (en) * 1994-10-17 2003-08-14 The Regents Of The University Of California, A Corporation Of The State Of California Distributed hypermedia method and system for automatically invoking external application providing interaction and display of embedded objects within a hypermedia document
US20030223499A1 (en) * 2002-04-09 2003-12-04 Nicholas Routhier Process and system for encoding and playback of stereoscopic video sequences
US20040027452A1 (en) * 2002-08-07 2004-02-12 Yun Kug Jin Method and apparatus for multiplexing multi-view three-dimensional moving picture
US6697687B1 (en) * 1998-11-09 2004-02-24 Hitachi, Ltd. Image display apparatus having audio output control means in accordance with image signal type
US20040036763A1 (en) * 1994-11-14 2004-02-26 Swift David C. Intelligent method and system for producing and displaying stereoscopically-multiplexed images of three-dimensional objects for use in realistic stereoscopic viewing thereof in interactive virtual reality display environments
US20040041747A1 (en) * 2002-08-27 2004-03-04 Nec Corporation 3D image/2D image switching display apparatus and portable terminal device
US6710920B1 (en) * 1998-03-27 2004-03-23 Sanyo Electric Co., Ltd Stereoscopic display
US20040109093A1 (en) * 2002-12-05 2004-06-10 Small-Stryker Aaron Tug Method and apparatus for simultaneous television video presentation and separate viewing of different broadcasts
US20040141237A1 (en) * 1995-06-07 2004-07-22 Wohlstadter Jacob N. Three dimensional imaging system
US20040164292A1 (en) * 2003-02-21 2004-08-26 Yeh-Jiun Tung Transflective display having an OLED backlight
US20040239231A1 (en) * 2002-10-30 2004-12-02 Keisuke Miyagawa Display device and electronic equipment
US20040252187A1 (en) * 2001-09-10 2004-12-16 Alden Ray M. Processes and apparatuses for efficient multiple program and 3D display
US20050073472A1 (en) * 2003-07-26 2005-04-07 Samsung Electronics Co., Ltd. Method of removing Moire pattern in 3D image display apparatus using complete parallax
US20050128353A1 (en) * 2003-12-16 2005-06-16 Young Bruce A. System and method for using second remote control device for sub-picture control in television receiver
US20050237487A1 (en) * 2004-04-23 2005-10-27 Chang Nelson L A Color wheel assembly for stereoscopic imaging
US20050248561A1 (en) * 2002-04-25 2005-11-10 Norio Ito Multimedia information generation method and multimedia information reproduction device
US20060050785A1 (en) * 2004-09-09 2006-03-09 Nucore Technology Inc. Inserting a high resolution still image into a lower resolution video stream
US7030903B2 (en) * 1997-02-20 2006-04-18 Canon Kabushiki Kaisha Image display system, information processing apparatus, and method of controlling the same
US7038698B1 (en) * 1996-02-08 2006-05-02 Palm Charles S 3D stereo browser for the internet
US20060109242A1 (en) * 2004-11-19 2006-05-25 Simpkins Daniel S User interface for impaired users
US20060139490A1 (en) * 2004-12-15 2006-06-29 Fekkes Wilhelmus F Synchronizing audio with delayed video
US20060139448A1 (en) * 2004-12-29 2006-06-29 Samsung Electronics Co., Ltd. 3D displays with flexible switching capability of 2D/3D viewing modes
US7123213B2 (en) * 1995-10-05 2006-10-17 Semiconductor Energy Laboratory Co., Ltd. Three dimensional display unit and display method
US20060244918A1 (en) * 2005-04-27 2006-11-02 Actuality Systems, Inc. Minimized-thickness angular scanner of electromagnetic radiation
US20060256302A1 (en) * 2005-05-13 2006-11-16 Microsoft Corporation Three-dimensional (3D) image projection
US20060256136A1 (en) * 2001-10-01 2006-11-16 Adobe Systems Incorporated, A Delaware Corporation Compositing two-dimensional and three-dimensional image layers
US20060271791A1 (en) * 2005-05-27 2006-11-30 Sbc Knowledge Ventures, L.P. Method and system for biometric based access control of media content presentation devices
US20070002041A1 (en) * 2005-07-02 2007-01-04 Samsung Electronics Co., Ltd. Method and apparatus for encoding/decoding video data to implement local three-dimensional video
US20070008620A1 (en) * 2005-07-11 2007-01-11 Samsung Electronics Co., Ltd. Switchable autostereoscopic display
US20070008406A1 (en) * 2005-07-08 2007-01-11 Samsung Electronics Co., Ltd. High resolution 2D-3D switchable autostereoscopic display apparatus
US20070052807A1 (en) * 2005-09-07 2007-03-08 Fuji Xerox Co., Ltd. System and method for user monitoring interface of 3-D video streams from multiple cameras
US20070085814A1 (en) * 2003-09-20 2007-04-19 Koninklijke Philips Electronics N.V. Image display device
US20070097208A1 (en) * 2003-05-28 2007-05-03 Satoshi Takemoto Stereoscopic image display apparatus, text data processing apparatus, program, and storing medium
US20070097103A1 (en) * 2003-09-11 2007-05-03 Shoji Yoshioka Portable display device
US20070096125A1 (en) * 2005-06-24 2007-05-03 Uwe Vogel Illumination device
US20070139371A1 (en) * 2005-04-04 2007-06-21 Harsham Bret A Control system and method for differentiating multiple users utilizing multi-view display devices
US20070146267A1 (en) * 2005-12-22 2007-06-28 Lg.Philips Lcd Co., Ltd. Display device and method of driving the same
US20070147827A1 (en) * 2005-12-28 2007-06-28 Arnold Sheynman Methods and apparatus for wireless stereo video streaming
US20070153916A1 (en) * 2005-12-30 2007-07-05 Sharp Laboratories Of America, Inc. Wireless video transmission system
US20070162392A1 (en) * 2006-01-12 2007-07-12 Microsoft Corporation Management of Streaming Content
US20070258140A1 (en) * 2006-05-04 2007-11-08 Samsung Electronics Co., Ltd. Multiview autostereoscopic display
US20070270218A1 (en) * 2006-05-08 2007-11-22 Nintendo Co., Ltd. Storage medium having game program stored thereon and game apparatus
US20070296874A1 (en) * 2004-10-20 2007-12-27 Fujitsu Ten Limited Display Device,Method of Adjusting the Image Quality of the Display Device, Device for Adjusting the Image Quality and Device for Adjusting the Contrast
US20080025390A1 (en) * 2006-07-25 2008-01-31 Fang Shi Adaptive video frame interpolation
US20080037120A1 (en) * 2006-08-08 2008-02-14 Samsung Electronics Co., Ltd High resolution 2d/3d switchable display apparatus
US20080043644A1 (en) * 2006-08-18 2008-02-21 Microsoft Corporation Techniques to perform rate matching for multimedia conference calls
US20080043096A1 (en) * 2006-04-04 2008-02-21 Anthony Vetro Method and System for Decoding and Displaying 3D Light Fields
US20080068329A1 (en) * 2006-09-15 2008-03-20 Samsung Electronics Co., Ltd. Multi-view autostereoscopic display with improved resolution
US7359105B2 (en) * 2006-02-07 2008-04-15 Sharp Kabushiki Kaisha Spatial light modulator and a display device
US20080126557A1 (en) * 2006-09-08 2008-05-29 Tetsuro Motoyama System, method, and computer program product using an SNMP implementation to obtain vendor information from remote devices
US20080133122A1 (en) * 2006-03-29 2008-06-05 Sanyo Electric Co., Ltd. Multiple visual display device and vehicle-mounted navigation system
US20080150853A1 (en) * 2006-12-22 2008-06-26 Hong Kong Applied Science and Technology Research Institute Company Limited Backlight device and liquid crystal display incorporating the backlight device
US20080168129A1 (en) * 2007-01-08 2008-07-10 Jeffrey Robbin Pairing a Media Server and a Media Client
US20080165176A1 (en) * 2006-09-28 2008-07-10 Charles Jens Archer Method of Video Display and Multiplayer Gaming
US20080192112A1 (en) * 2005-03-18 2008-08-14 Ntt Data Sanyo System Corporation Stereoscopic Image Display Apparatus, Stereoscopic Image Displaying Method And Computer Program Product
US20080191964A1 (en) * 2005-04-22 2008-08-14 Koninklijke Philips Electronics, N.V. Auto-Stereoscopic Display With Mixed Mode For Concurrent Display of Two- and Three-Dimensional Images
US20080246757A1 (en) * 2005-04-25 2008-10-09 Masahiro Ito 3D Image Generation and Display System
US7440193B2 (en) * 2004-04-30 2008-10-21 Gunasekaran R Alfred Wide-angle variable focal length lens system
US20080259233A1 (en) * 2005-12-20 2008-10-23 Koninklijke Philips Electronics, N.V. Autostereoscopic Display Device
US20080273242A1 (en) * 2003-09-30 2008-11-06 Graham John Woodgate Directional Display Apparatus
US20080284844A1 (en) * 2003-02-05 2008-11-20 Graham John Woodgate Switchable Lens
US20080303832A1 (en) * 2007-06-11 2008-12-11 Samsung Electronics Co., Ltd. Method of generating two-dimensional/three-dimensional convertible stereoscopic image bitstream and method and apparatus for displaying the same
US20090002178A1 (en) * 2007-06-29 2009-01-01 Microsoft Corporation Dynamic mood sensing
US20090010264A1 (en) * 2006-03-21 2009-01-08 Huawei Technologies Co., Ltd. Method and System for Ensuring QoS and SLA Server
US20090052164A1 (en) * 2007-08-24 2009-02-26 Masako Kashiwagi Directional backlight, display apparatus, and stereoscopic display apparatus
US20090051759A1 (en) * 2005-05-27 2009-02-26 Adkins Sean M Equipment and methods for the synchronization of stereoscopic projection displays
US20090058845A1 (en) * 2004-10-20 2009-03-05 Yasuhiro Fukuda Display device
US7511774B2 (en) * 2005-11-30 2009-03-31 Samsung Mobile Display Co., Ltd. Three-dimensional display device
US20090115800A1 (en) * 2005-01-18 2009-05-07 Koninklijke Philips Electronics, N.V. Multi-view display device
US20090115783A1 (en) * 2007-11-02 2009-05-07 Dimension Technologies, Inc. 3d optical illusions from off-axis displays
US20090133051A1 (en) * 2007-11-21 2009-05-21 Gesturetek, Inc. Device access control
US20090138805A1 (en) * 2007-11-21 2009-05-28 Gesturetek, Inc. Media preferences
US20090141182A1 (en) * 2007-12-03 2009-06-04 Panasonic Corporation Digital broadcast receiving apparatus, semiconductor integrated circuit, and digital broadcast receiving method
US20090167639A1 (en) * 2008-01-02 2009-07-02 3M Innovative Properties Company Methods of reducing perceived image crosstalk in a multiview display
US20090174700A1 (en) * 2005-03-31 2009-07-09 Casio Computer Co., Ltd. Illuminator for emitting at least two lights having directivity and display apparatus using same
US20090232202A1 (en) * 2004-12-10 2009-09-17 Koninklijke Philips Electronics, N.V. Wireless video streaming using single layer coding and prioritized streaming
US20090238378A1 (en) * 2008-03-18 2009-09-24 Invism, Inc. Enhanced Immersive Soundscapes Production
US20090319625A1 (en) * 2008-06-20 2009-12-24 Alcatel Lucent Interactivity in a digital public signage network architecture

Family Cites Families (166)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS56109649A (en) 1980-02-05 1981-08-31 Matsushita Electric Ind Co Ltd Ultrasonic diagnosing device
JPH05122733A (en) * 1991-10-28 1993-05-18 Nippon Hoso Kyokai (NHK) Three-dimensional picture display device
US5493427A (en) 1993-05-25 1996-02-20 Sharp Kabushiki Kaisha Three-dimensional display unit with a variable lens
US7190518B1 (en) * 1996-01-22 2007-03-13 3Ality, Inc. Systems for and methods of three dimensional viewing
JPH10232626A (en) * 1997-02-20 1998-09-02 Canon Inc Stereoscopic image display device
US6590605B1 (en) 1998-10-14 2003-07-08 Dimension Technologies, Inc. Autostereoscopic display
US6757422B1 (en) 1998-11-12 2004-06-29 Canon Kabushiki Kaisha Viewpoint position detection apparatus and method, and stereoscopic image display system
US6533420B1 (en) 1999-01-22 2003-03-18 Dimension Technologies, Inc. Apparatus and method for generating and projecting autostereoscopic images
US6591306B1 (en) * 1999-04-01 2003-07-08 Nec Corporation IP network access for portable devices
US20050177850A1 (en) * 1999-10-29 2005-08-11 United Video Properties, Inc. Interactive television system with programming-related links
US8271336B2 (en) 1999-11-22 2012-09-18 Accenture Global Services Gmbh Increased visibility during order management in a network-based supply chain environment
US7389214B1 (en) 2000-05-01 2008-06-17 Accenture, Llp Category analysis in a market management
US6765568B2 (en) 2000-06-12 2004-07-20 Vrex, Inc. Electronic stereoscopic media delivery system
US6856581B1 (en) 2000-10-31 2005-02-15 International Business Machines Corporation Batteryless, oscillatorless, binary time cell usable as an horological device with associated programming methods and devices
AU2002232928A1 (en) 2000-11-03 2002-05-15 Zoesis, Inc. Interactive character system
DE10103922A1 (en) 2001-01-30 2002-08-01 Physoptics Opto Electronic Gmb Interactive data viewing and operating system
US20020194604A1 (en) 2001-06-19 2002-12-19 Sanchez Elizabeth C. Interactive television virtual shopping cart
JP2003322824A (en) * 2002-02-26 2003-11-14 Namco Ltd Stereoscopic video display device and electronic apparatus
JP3738843B2 (en) 2002-06-11 2006-01-25 Sony Corporation Image detection apparatus, image detection method, and image detection program
EP1529400A4 (en) 2002-07-16 2009-09-23 Korea Electronics Telecomm Apparatus and method for adapting 2d and 3d stereoscopic video signal
JP2004072202A (en) 2002-08-01 2004-03-04 KTFreetel Co., Ltd. Separate billing method of communication utility charge and apparatus therefor
US20080008202A1 (en) 2002-10-31 2008-01-10 Terrell William C Router with routing processors and methods for virtualization
US7769668B2 (en) 2002-12-09 2010-08-03 Sam Balabon System and method for facilitating trading of financial instruments
US8799366B2 (en) 2002-12-11 2014-08-05 Broadcom Corporation Migration of stored media through a media exchange network
US8270810B2 (en) 2002-12-11 2012-09-18 Broadcom Corporation Method and system for advertisement insertion and playback for STB with PVR functionality
CA2457602A1 (en) 2003-02-19 2004-08-19 Impatica Inc. Method of synchronizing streams of real time data
US8438601B2 (en) 2003-07-02 2013-05-07 Rovi Solutions Corporation Resource management for a networked personal video recording system
US7557876B2 (en) 2003-07-25 2009-07-07 Nitto Denko Corporation Anisotropic fluorescent thin crystal film and backlight system and liquid crystal display incorporating the same
GB0326005D0 (en) 2003-11-07 2003-12-10 Koninkl Philips Electronics Nv Waveguide for autostereoscopic display
US7488072B2 (en) 2003-12-04 2009-02-10 New York University Eye tracked foveal display by controlled illumination
WO2005071474A2 (en) 2004-01-20 2005-08-04 Sharp Kabushiki Kaisha Directional backlight and multiple view display device
US7091471B2 (en) 2004-03-15 2006-08-15 Agilent Technologies, Inc. Using eye detection for providing control and power management of electronic devices
US20060087556A1 (en) 2004-10-21 2006-04-27 Kazunari Era Stereoscopic image display device
EP1838899A2 (en) 2004-11-30 2007-10-03 Agoura Technologies Inc. Applications and fabrication techniques for large scale wire grid polarizers
KR100786862B1 (en) 2004-11-30 2007-12-20 삼성에스디아이 주식회사 Barrier device, three dimensional image display using the same and method thereof
KR100732961B1 (en) 2005-04-01 2007-06-27 Industry-Academic Cooperation Foundation, Kyung Hee University Multiview scalable image encoding and decoding method, and apparatus therefor
RU2322771C2 (en) 2005-04-25 2008-04-20 Svyatoslav Ivanovich Arsenich Stereo-projection system
ES2860754T3 (en) 2005-04-29 2021-10-05 Koninklijke Philips N.V. A stereoscopic display apparatus
KR100661241B1 (en) 2005-05-16 2006-12-22 LG Electronics Inc. Fabrication method of optical sheet
GB2426351A (en) * 2005-05-19 2006-11-22 Sharp Kk A dual view display
KR100813961B1 (en) * 2005-06-14 2008-03-14 Samsung Electronics Co., Ltd. Method and apparatus for transmitting and receiving of video, and transport stream structure thereof
US9465450B2 (en) 2005-06-30 2016-10-11 Koninklijke Philips N.V. Method of controlling a system
KR100647517B1 (en) 2005-08-26 2006-11-23 MasterImage Co., Ltd. Cell type parallax-barrier and stereoscopic image display apparatus using the same
US8157651B2 (en) 2005-09-12 2012-04-17 Nintendo Co., Ltd. Information processing program
CN101300520B (en) 2005-11-02 2012-11-14 Koninklijke Philips Electronics N.V. Optical system for 3-dimensional display
US20070110035A1 (en) 2005-11-14 2007-05-17 Broadcom Corporation, A California Corporation Network nodes cooperatively routing traffic flow amongst wired and wireless networks
JP5121136B2 (en) 2005-11-28 2013-01-16 Japan Display West Inc. Image display device, electronic device, portable device, and image display method
KR100739067B1 (en) 2005-11-30 2007-07-12 Samsung SDI Co., Ltd. Three-dimensional display device
JP5305922B2 (en) * 2005-12-20 2013-10-02 Koninklijke Philips N.V. Autostereoscopic display device
US20070153122A1 (en) 2005-12-30 2007-07-05 Ayite Nii A Apparatus and method for simultaneous multiple video channel viewing
JP5508721B2 (en) 2006-02-10 2014-06-04 RealD Inc. Multifunctional active matrix liquid crystal display
US20070225994A1 (en) * 2006-03-17 2007-09-27 Moore Barrett H Method for Providing Private Civil Security Services Bundled with Second Party Products
US8368749B2 (en) * 2006-03-27 2013-02-05 Ge Inspection Technologies Lp Article inspection apparatus
US8466954B2 (en) 2006-04-03 2013-06-18 Sony Computer Entertainment Inc. Screen sharing method and apparatus
KR100893616B1 (en) 2006-04-17 2009-04-20 Samsung Mobile Display Co., Ltd. Electronic imaging device, 2D/3D image display device and the driving method thereof
TWI378747B (en) 2006-08-18 2012-12-01 Ind Tech Res Inst Flexible electronic assembly
US20110090413A1 (en) 2006-08-18 2011-04-21 Industrial Technology Research Institute 3-dimensional image display
US7844547B2 (en) 2006-08-21 2010-11-30 Carl Raymond Amos Uncle gem IV, universal automatic instant money, data and precious metal and stone transfer machine
US8587638B2 (en) 2006-09-25 2013-11-19 Nokia Corporation Supporting a 3D presentation
JP4669482B2 (en) * 2006-09-29 2011-04-13 Seiko Epson Corporation Display device, image processing method, and electronic apparatus
US8645176B2 (en) 2006-10-05 2014-02-04 Trimble Navigation Limited Utilizing historical data in an asset management environment
US20080086685A1 (en) * 2006-10-05 2008-04-10 James Janky Method for delivering tailored asset information to a device
US20080086391A1 (en) 2006-10-05 2008-04-10 Kurt Maynard Impromptu asset tracking
US7640223B2 (en) 2006-11-16 2009-12-29 University Of Tennessee Research Foundation Method of organizing and presenting data in a table using stutter peak rule
US7586681B2 (en) * 2006-11-29 2009-09-08 Honeywell International Inc. Directional display
TW200834151A (en) 2006-11-30 2008-08-16 Westar Display Technologies Inc Motion artifact measurement for display devices
JP4285532B2 (en) 2006-12-01 2009-06-24 Sony Corporation Backlight control device, backlight control method, and liquid crystal display device
US8248462B2 (en) * 2006-12-15 2012-08-21 The Board Of Trustees Of The University Of Illinois Dynamic parallax barrier autostereoscopic display system and method
JP4686795B2 (en) * 2006-12-27 2011-05-25 Fujifilm Corporation Image generating apparatus and image reproducing apparatus
US7924456B1 (en) 2007-01-12 2011-04-12 Broadbus Technologies, Inc. Data distribution and buffering
CN101013559A (en) 2007-01-30 2007-08-08 BOE Technology Group Co., Ltd. LED brightness control circuit and backlight of LCD
JP4255032B2 (en) 2007-03-15 2009-04-15 Fujitsu Ten Limited Display device and display method
US7917853B2 (en) 2007-03-21 2011-03-29 At&T Intellectual Property I, L.P. System and method of presenting media content
US8269822B2 (en) 2007-04-03 2012-09-18 Sony Computer Entertainment America, LLC Display viewing system and methods for optimizing display view based on active tracking
US8600932B2 (en) 2007-05-07 2013-12-03 Trimble Navigation Limited Telematic asset microfluidic analysis
GB0709134D0 (en) * 2007-05-11 2007-06-20 Surman Philip Multi-user autostereoscopic Display
GB0709411D0 (en) 2007-05-16 2007-06-27 Barco Nv Methods and systems for stereoscopic imaging
TWI466093B (en) * 2007-06-26 2014-12-21 Apple Inc Management techniques for video playback
KR101400285B1 (en) 2007-08-03 2014-05-30 Samsung Electronics Co., Ltd. Front light unit and flat display apparatus employing the same
US7911442B2 (en) 2007-08-27 2011-03-22 AU Optronics Corporation Dynamic color gamut of LED backlight
KR101362647B1 (en) 2007-09-07 2014-02-12 Samsung Electronics Co., Ltd. System and method for generating and playing three dimensional image file including two dimensional image
US7881976B2 (en) * 2007-09-27 2011-02-01 Virgin Mobile Usa, L.P. Apparatus, methods and systems for discounted referral and recommendation of electronic content
GB2453323A (en) 2007-10-01 2009-04-08 Sharp Kk Flexible backlight arrangement and display
TWI354115B (en) * 2007-10-05 2011-12-11 Ind Tech Res Inst Three-dimensional display apparatus
US8416247B2 (en) 2007-10-09 2013-04-09 Sony Computer Entertainment America Inc. Increasing the number of advertising impressions in an interactive environment
US8031175B2 (en) 2008-04-21 2011-10-04 Panasonic Corporation Touch sensitive remote control system that detects hand size characteristics of user and adapts mapping to screen display
US8121191B1 (en) 2007-11-13 2012-02-21 Harmonic Inc. AVC to SVC transcoder
JP4956520B2 (en) 2007-11-13 2012-06-20 Mitsumi Electric Co., Ltd. Backlight device and liquid crystal display device using the same
KR101439845B1 (en) * 2007-11-16 2014-09-12 Samsung Electronics Co., Ltd. Digital image processing apparatus
US20090138280A1 (en) * 2007-11-26 2009-05-28 The General Electric Company Multi-stepped default display protocols
TWI365302B (en) * 2007-12-31 2012-06-01 Ind Tech Res Inst Stereo image display with switch function between horizontal display and vertical display
JP2011514980A (en) 2008-02-08 2011-05-12 Koninklijke Philips Electronics N.V. Autostereoscopic display device
KR101451565B1 (en) 2008-02-13 2014-10-16 Samsung Electronics Co., Ltd. Autostereoscopic display system
JP5642347B2 (en) 2008-03-07 2014-12-17 Mitsumi Electric Co., Ltd. LCD backlight device
KR101488199B1 (en) * 2008-03-12 2015-01-30 Samsung Electronics Co., Ltd. Method and apparatus for processing and reproducing image, and computer readable medium thereof
US20090244266A1 (en) 2008-03-26 2009-10-01 Thomas Carl Brigham Enhanced Three Dimensional Television
JP4925354B2 (en) 2008-03-31 2012-04-25 Fujifilm Corporation Image processing apparatus, image display apparatus, imaging apparatus, and image processing method
GB0806183D0 (en) 2008-04-04 2008-05-14 Picsel Res Ltd Presentation of objects in 3D displays
US20090282429A1 (en) * 2008-05-07 2009-11-12 Sony Ericsson Mobile Communications Ab Viewer tracking for displaying three dimensional views
DE102008001644B4 (en) * 2008-05-08 2010-03-04 Seereal Technologies S.A. Device for displaying three-dimensional images
US20090295791A1 (en) 2008-05-29 2009-12-03 Microsoft Corporation Three-dimensional environment created from video
CN101291415B (en) 2008-05-30 2010-07-21 Huawei Device Co., Ltd. Method, apparatus and system for three-dimensional video communication
TWI401658B (en) 2008-07-18 2013-07-11 Hannstar Display Corp Gate line driving circuit of LCD panel
JP5127633B2 (en) 2008-08-25 2013-01-23 Mitsubishi Electric Corporation Content playback apparatus and method
US20100070987A1 (en) 2008-09-12 2010-03-18 At&T Intellectual Property I, L.P. Mining viewer responses to multimedia content
JP2010074557A (en) 2008-09-18 2010-04-02 Toshiba Corp Television receiver
WO2010032419A1 (en) 2008-09-18 2010-03-25 Panasonic Corporation Image decoding device, image encoding device, image decoding method, image encoding method, and program
KR101497511B1 (en) * 2008-09-19 2015-03-02 Samsung Electronics Co., Ltd. Apparatus for multiplexing 2-dimensional and 3-dimensional image and video
KR20100033067A (en) 2008-09-19 2010-03-29 Samsung Electronics Co., Ltd. Image display apparatus and method for both 2D and 3D image
CA2691727C (en) 2008-09-30 2016-10-04 Panasonic Corporation Recording medium, playback device, system LSI, playback method, glasses, and display device for 3D images
US20100107184A1 (en) 2008-10-23 2010-04-29 Peter Rae Shintani TV with eye detection
US8752087B2 (en) 2008-11-07 2014-06-10 At&T Intellectual Property I, L.P. System and method for dynamically constructing personalized contextual video programs
CN102224737B (en) 2008-11-24 2014-12-03 Koninklijke Philips Electronics N.V. Combining 3D video and auxiliary data
US20100128112A1 (en) 2008-11-26 2010-05-27 Samsung Electronics Co., Ltd Immersive display system for interacting with three-dimensional content
US8103608B2 (en) 2008-11-26 2012-01-24 Microsoft Corporation Reference model for data-driven analytics
US20100135640A1 (en) 2008-12-03 2010-06-03 Dell Products L.P. System and Method for Storing and Displaying 3-D Video Content
US8209396B1 (en) 2008-12-10 2012-06-26 Howcast Media, Inc. Video player
CN102272778B (en) 2009-01-07 2015-05-20 Thomson Licensing Joint depth estimation
WO2010095440A1 (en) 2009-02-20 2010-08-26 Panasonic Corporation Recording medium, reproduction device, and integrated circuit
WO2010095381A1 (en) 2009-02-20 2010-08-26 Panasonic Corporation Recording medium, reproduction device, and integrated circuit
US9565397B2 (en) 2009-02-26 2017-02-07 Akamai Technologies, Inc. Deterministically skewing transmission of content streams
US20100225576A1 (en) 2009-03-03 2010-09-09 Horizon Semiconductors Ltd. Three-dimensional interactive system and method
US8477175B2 (en) 2009-03-09 2013-07-02 Cisco Technology, Inc. System and method for providing three dimensional imaging in a network environment
US20100231511A1 (en) 2009-03-10 2010-09-16 David L. Henty Interactive media system with multi-directional remote control and dual mode camera
WO2010107227A2 (en) 2009-03-16 2010-09-23 Lg Electronics Inc. A method of displaying three-dimensional image data and an apparatus of processing three-dimensional image data
US20100247080A1 (en) * 2009-03-27 2010-09-30 Kug-Jin Yun Method and apparatus for creating and consuming multiview image media file
JP5695819B2 (en) 2009-03-30 2015-04-08 Hitachi Maxell, Ltd. TV operation method
WO2010117315A1 (en) 2009-04-09 2010-10-14 Telefonaktiebolaget Lm Ericsson (Publ) Media container file management
JP5815505B2 (en) 2009-04-26 2015-11-17 Nike Innovate C.V. Exercise clock
US8315405B2 (en) 2009-04-28 2012-11-20 Bose Corporation Coordinated ANR reference sound compression
US8532310B2 (en) 2010-03-30 2013-09-10 Bose Corporation Frequency-dependent ANR reference sound compression
US20100280959A1 (en) 2009-05-01 2010-11-04 Darrel Stone Real-time sourcing of service providers
JP4960507B2 (en) 2009-05-15 2012-06-27 Toshiba Corporation Video display device and control device
US8788676B2 (en) 2009-05-22 2014-07-22 Motorola Mobility Llc Method and system for controlling data transmission to or from a mobile device
US9237296B2 (en) 2009-06-01 2016-01-12 Lg Electronics Inc. Image display apparatus and operating method thereof
US8704958B2 (en) 2009-06-01 2014-04-22 Lg Electronics Inc. Image display device and operation method thereof
US20100309290A1 (en) 2009-06-08 2010-12-09 Stephen Brooks Myers System for capture and display of stereoscopic content
US9185328B2 (en) 2009-06-08 2015-11-10 Lg Electronics Inc. Device and method for displaying a three-dimensional PIP image
US8411746B2 (en) 2009-06-12 2013-04-02 Qualcomm Incorporated Multiview video coding over MPEG-2 systems
US20100321465A1 (en) 2009-06-19 2010-12-23 Dominique A Behrens Pa Method, System and Computer Program Product for Mobile Telepresence Interactions
EP2573615A3 (en) 2009-08-07 2014-05-07 RealD Inc. Stereoscopic flat panel display with updated blanking intervals
US8976871B2 (en) 2009-09-16 2015-03-10 Qualcomm Incorporated Media extractor tracks for file format track selection
US8446462B2 (en) 2009-10-15 2013-05-21 At&T Intellectual Property I, L.P. Method and system for time-multiplexed shared display
US20110093882A1 (en) 2009-10-21 2011-04-21 Candelore Brant L Parental control through the HDMI interface
KR101600818B1 (en) 2009-11-06 2016-03-09 Samsung Display Co., Ltd. Three-dimensional optical module and display device including the same
US8705624B2 (en) 2009-11-24 2014-04-22 STMicroelectronics International N.V. Parallel decoding for scalable video coding
US8335763B2 (en) 2009-12-04 2012-12-18 Microsoft Corporation Concurrently presented data subfeeds
US8462197B2 (en) 2009-12-17 2013-06-11 Motorola Mobility Llc 3D video transforming device
US20110153362A1 (en) 2009-12-17 2011-06-23 Valin David A Method and mechanism for identifying protecting, requesting, assisting and managing information
US8823782B2 (en) 2009-12-31 2014-09-02 Broadcom Corporation Remote control with integrated position, viewer identification and optical and audio test
US8964013B2 (en) 2009-12-31 2015-02-24 Broadcom Corporation Display with elastic light manipulator
US9247286B2 (en) 2009-12-31 2016-01-26 Broadcom Corporation Frame formatting supporting mixed two and three dimensional video data communication
US8854531B2 (en) 2009-12-31 2014-10-07 Broadcom Corporation Multiple remote controllers that each simultaneously controls a different visual presentation of a 2D/3D display
US8384774B2 (en) 2010-02-15 2013-02-26 Eastman Kodak Company Glasses for viewing stereo images
US20110199469A1 (en) * 2010-02-15 2011-08-18 Gallagher Andrew C Detection and display of stereo images
KR101356248B1 (en) 2010-02-19 2014-01-29 LG Display Co., Ltd. Image display device
US9285589B2 (en) 2010-02-28 2016-03-15 Microsoft Technology Licensing, Llc AR glasses with event and sensor triggered control of AR eyepiece applications
US8964298B2 (en) 2010-02-28 2015-02-24 Microsoft Corporation Video display modification based on sensor input for a see-through near-to-eye display
US9129295B2 (en) 2010-02-28 2015-09-08 Microsoft Technology Licensing, Llc See-through near-eye display glasses with a fast response photochromic film system for quick transition from dark to clear
KR101324412B1 (en) 2010-05-06 2013-11-01 LG Display Co., Ltd. Stereoscopic image display and driving method thereof
JPWO2011142141A1 (en) 2010-05-13 2013-07-22 Panasonic Corporation Display device and video viewing system
KR101255711B1 (en) 2010-07-02 2013-04-17 LG Display Co., Ltd. 3D image display device and driving method thereof
US8605136B2 (en) 2010-08-10 2013-12-10 Sony Corporation 2D to 3D user interface content data conversion
US8363928B1 (en) 2010-12-24 2013-01-29 Trimble Navigation Ltd. General orientation positioning system
JP5640143B2 (en) * 2011-03-31 2014-12-10 Fujifilm Corporation Imaging apparatus and imaging method
WO2013078317A1 (en) * 2011-11-21 2013-05-30 Schlumberger Technology Corporation Interface for controlling and improving drilling operations

Patent Citations (99)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4829365A (en) * 1986-03-07 1989-05-09 Dimension Technologies, Inc. Autostereoscopic display with illuminating lines, light valve and mask
US20030154261A1 (en) * 1994-10-17 2003-08-14 The Regents Of The University Of California, A Corporation Of The State Of California Distributed hypermedia method and system for automatically invoking external application providing interaction and display of embedded objects within a hypermedia document
US20040036763A1 (en) * 1994-11-14 2004-02-26 Swift David C. Intelligent method and system for producing and displaying stereoscopically-multiplexed images of three-dimensional objects for use in realistic stereoscopic viewing thereof in interactive virtual reality display environments
US5615046A (en) * 1995-01-23 1997-03-25 Cyber Scientific Inc. Stereoscopic viewing system
US6094216A (en) * 1995-05-22 2000-07-25 Canon Kabushiki Kaisha Stereoscopic image display method, and stereoscopic image display apparatus using the method
US20040141237A1 (en) * 1995-06-07 2004-07-22 Wohlstadter Jacob N. Three dimensional imaging system
US6909555B2 (en) * 1995-06-07 2005-06-21 Jacob N. Wohlstadter Three dimensional imaging system
US5945965A (en) * 1995-06-29 1999-08-31 Canon Kabushiki Kaisha Stereoscopic image display method
US5959597A (en) * 1995-09-28 1999-09-28 Sony Corporation Image/audio reproducing system
US7123213B2 (en) * 1995-10-05 2006-10-17 Semiconductor Energy Laboratory Co., Ltd. Three dimensional display unit and display method
US6049424A (en) * 1995-11-15 2000-04-11 Sanyo Electric Co., Ltd. Three dimensional display device
US7038698B1 (en) * 1996-02-08 2006-05-02 Palm Charles S 3D stereo browser for the internet
US6023277A (en) * 1996-07-03 2000-02-08 Canon Kabushiki Kaisha Display control apparatus and method
US5855425A (en) * 1996-07-19 1999-01-05 Sanyo Electric Co., Ltd. Stereoscopic display
US5969850A (en) * 1996-09-27 1999-10-19 Sharp Kabushiki Kaisha Spatial light modulator, directional display and directional light source
US5990975A (en) * 1996-11-22 1999-11-23 Acer Peripherals, Inc. Dual screen displaying device
US6285368B1 (en) * 1997-02-10 2001-09-04 Canon Kabushiki Kaisha Image display system and image display apparatus and information processing apparatus in the system
US7030903B2 (en) * 1997-02-20 2006-04-18 Canon Kabushiki Kaisha Image display system, information processing apparatus, and method of controlling the same
US6188442B1 (en) * 1997-08-01 2001-02-13 International Business Machines Corporation Multiviewer display system for television monitors
US6710920B1 (en) * 1998-03-27 2004-03-23 Sanyo Electric Co., Ltd Stereoscopic display
US6144375A (en) * 1998-08-14 2000-11-07 Praja Inc. Multi-perspective viewer for content-based interactivity
US6697687B1 (en) * 1998-11-09 2004-02-24 Hitachi, Ltd. Image display apparatus having audio output control means in accordance with image signal type
US20020171666A1 (en) * 1999-02-19 2002-11-21 Takaaki Endo Image processing apparatus for interpolating and generating images from an arbitrary view point
US20020010798A1 (en) * 2000-04-20 2002-01-24 Israel Ben-Shaul Differentiated content and application delivery via internet
US20030103165A1 (en) * 2000-05-19 2003-06-05 Werner Bullinger System for operating a consumer electronics appliance
US20020037037A1 (en) * 2000-09-22 2002-03-28 Philips Electronics North America Corporation Preferred transmission/streaming order of fine-granular scalability
US20020167862A1 (en) * 2001-04-03 2002-11-14 Carlo Tomasi Method and apparatus for approximating a source position of a sound-causing event for determining an input used in operating an electronic device
US20040252187A1 (en) * 2001-09-10 2004-12-16 Alden Ray M. Processes and apparatuses for efficient multiple program and 3D display
US20060256136A1 (en) * 2001-10-01 2006-11-16 Adobe Systems Incorporated, A Delaware Corporation Compositing two-dimensional and three-dimensional image layers
US20030137506A1 (en) * 2001-11-30 2003-07-24 Daniel Efran Image-based rendering for 3D viewing
US20030223499A1 (en) * 2002-04-09 2003-12-04 Nicholas Routhier Process and system for encoding and playback of stereoscopic video sequences
US20050248561A1 (en) * 2002-04-25 2005-11-10 Norio Ito Multimedia information generation method and multimedia information reproduction device
US20040027452A1 (en) * 2002-08-07 2004-02-12 Yun Kug Jin Method and apparatus for multiplexing multi-view three-dimensional moving picture
US20040041747A1 (en) * 2002-08-27 2004-03-04 Nec Corporation 3D image/2D image switching display apparatus and portable terminal device
US20040239231A1 (en) * 2002-10-30 2004-12-02 Keisuke Miyagawa Display device and electronic equipment
US20040109093A1 (en) * 2002-12-05 2004-06-10 Small-Stryker Aaron Tug Method and apparatus for simultaneous television video presentation and separate viewing of different broadcasts
US20080284844A1 (en) * 2003-02-05 2008-11-20 Graham John Woodgate Switchable Lens
US20040164292A1 (en) * 2003-02-21 2004-08-26 Yeh-Jiun Tung Transflective display having an OLED backlight
US20070097208A1 (en) * 2003-05-28 2007-05-03 Satoshi Takemoto Stereoscopic image display apparatus, text data processing apparatus, program, and storing medium
US20050073472A1 (en) * 2003-07-26 2005-04-07 Samsung Electronics Co., Ltd. Method of removing Moire pattern in 3D image display apparatus using complete parallax
US20070097103A1 (en) * 2003-09-11 2007-05-03 Shoji Yoshioka Portable display device
US20070085814A1 (en) * 2003-09-20 2007-04-19 Koninklijke Philips Electronics N.V. Image display device
US20080273242A1 (en) * 2003-09-30 2008-11-06 Graham John Woodgate Directional Display Apparatus
US20050128353A1 (en) * 2003-12-16 2005-06-16 Young Bruce A. System and method for using second remote control device for sub-picture control in television receiver
US20050237487A1 (en) * 2004-04-23 2005-10-27 Chang Nelson L A Color wheel assembly for stereoscopic imaging
US7440193B2 (en) * 2004-04-30 2008-10-21 Gunasekaran R Alfred Wide-angle variable focal length lens system
US20060050785A1 (en) * 2004-09-09 2006-03-09 Nucore Technology Inc. Inserting a high resolution still image into a lower resolution video stream
US20070296874A1 (en) * 2004-10-20 2007-12-27 Fujitsu Ten Limited Display Device,Method of Adjusting the Image Quality of the Display Device, Device for Adjusting the Image Quality and Device for Adjusting the Contrast
US20090058845A1 (en) * 2004-10-20 2009-03-05 Yasuhiro Fukuda Display device
US20060109242A1 (en) * 2004-11-19 2006-05-25 Simpkins Daniel S User interface for impaired users
US20090232202A1 (en) * 2004-12-10 2009-09-17 Koninklijke Philips Electronics, N.V. Wireless video streaming using single layer coding and prioritized streaming
US20060139490A1 (en) * 2004-12-15 2006-06-29 Fekkes Wilhelmus F Synchronizing audio with delayed video
US20060139448A1 (en) * 2004-12-29 2006-06-29 Samsung Electronics Co., Ltd. 3D displays with flexible switching capability of 2D/3D viewing modes
US20090115800A1 (en) * 2005-01-18 2009-05-07 Koninklijke Philips Electronics, N.V. Multi-view display device
US20080192112A1 (en) * 2005-03-18 2008-08-14 Ntt Data Sanyo System Corporation Stereoscopic Image Display Apparatus, Stereoscopic Image Displaying Method And Computer Program Product
US20090174700A1 (en) * 2005-03-31 2009-07-09 Casio Computer Co., Ltd. Illuminator for emitting at least two lights having directivity and display apparatus using same
US20070139371A1 (en) * 2005-04-04 2007-06-21 Harsham Bret A Control system and method for differentiating multiple users utilizing multi-view display devices
US20080191964A1 (en) * 2005-04-22 2008-08-14 Koninklijke Philips Electronics, N.V. Auto-Stereoscopic Display With Mixed Mode For Concurrent Display of Two- and Three-Dimensional Images
US20080246757A1 (en) * 2005-04-25 2008-10-09 Masahiro Ito 3D Image Generation and Display System
US20060244918A1 (en) * 2005-04-27 2006-11-02 Actuality Systems, Inc. Minimized-thickness angular scanner of electromagnetic radiation
US20060256302A1 (en) * 2005-05-13 2006-11-16 Microsoft Corporation Three-dimensional (3D) image projection
US20060271791A1 (en) * 2005-05-27 2006-11-30 Sbc Knowledge Ventures, L.P. Method and system for biometric based access control of media content presentation devices
US20090051759A1 (en) * 2005-05-27 2009-02-26 Adkins Sean M Equipment and methods for the synchronization of stereoscopic projection displays
US20070096125A1 (en) * 2005-06-24 2007-05-03 Uwe Vogel Illumination device
US20070002041A1 (en) * 2005-07-02 2007-01-04 Samsung Electronics Co., Ltd. Method and apparatus for encoding/decoding video data to implement local three-dimensional video
US20070008406A1 (en) * 2005-07-08 2007-01-11 Samsung Electronics Co., Ltd. High resolution 2D-3D switchable autostereoscopic display apparatus
US20070008620A1 (en) * 2005-07-11 2007-01-11 Samsung Electronics Co., Ltd. Switchable autostereoscopic display
US20070052807A1 (en) * 2005-09-07 2007-03-08 Fuji Xerox Co., Ltd. System and method for user monitoring interface of 3-D video streams from multiple cameras
US7511774B2 (en) * 2005-11-30 2009-03-31 Samsung Mobile Display Co., Ltd. Three-dimensional display device
US20080259233A1 (en) * 2005-12-20 2008-10-23 Koninklijke Philips Electronics, N.V. Autostereoscopic Display Device
US20070146267A1 (en) * 2005-12-22 2007-06-28 Lg.Philips Lcd Co., Ltd. Display device and method of driving the same
US20070147827A1 (en) * 2005-12-28 2007-06-28 Arnold Sheynman Methods and apparatus for wireless stereo video streaming
US20070153916A1 (en) * 2005-12-30 2007-07-05 Sharp Laboratories Of America, Inc. Wireless video transmission system
US20070162392A1 (en) * 2006-01-12 2007-07-12 Microsoft Corporation Management of Streaming Content
US7359105B2 (en) * 2006-02-07 2008-04-15 Sharp Kabushiki Kaisha Spatial light modulator and a display device
US20090010264A1 (en) * 2006-03-21 2009-01-08 Huawei Technologies Co., Ltd. Method and System for Ensuring QoS and SLA Server
US20080133122A1 (en) * 2006-03-29 2008-06-05 Sanyo Electric Co., Ltd. Multiple visual display device and vehicle-mounted navigation system
US20080043096A1 (en) * 2006-04-04 2008-02-21 Anthony Vetro Method and System for Decoding and Displaying 3D Light Fields
US7626644B2 (en) * 2006-05-04 2009-12-01 Samsung Electronics Co., Ltd. Multiview autostereoscopic display
US20070258140A1 (en) * 2006-05-04 2007-11-08 Samsung Electronics Co., Ltd. Multiview autostereoscopic display
US20070270218A1 (en) * 2006-05-08 2007-11-22 Nintendo Co., Ltd. Storage medium having game program stored thereon and game apparatus
US20080025390A1 (en) * 2006-07-25 2008-01-31 Fang Shi Adaptive video frame interpolation
US20080037120A1 (en) * 2006-08-08 2008-02-14 Samsung Electronics Co., Ltd High resolution 2d/3d switchable display apparatus
US20080043644A1 (en) * 2006-08-18 2008-02-21 Microsoft Corporation Techniques to perform rate matching for multimedia conference calls
US20080126557A1 (en) * 2006-09-08 2008-05-29 Tetsuro Motoyama System, method, and computer program product using an SNMP implementation to obtain vendor information from remote devices
US20080068329A1 (en) * 2006-09-15 2008-03-20 Samsung Electronics Co., Ltd. Multi-view autostereoscopic display with improved resolution
US20080165176A1 (en) * 2006-09-28 2008-07-10 Charles Jens Archer Method of Video Display and Multiplayer Gaming
US20080150853A1 (en) * 2006-12-22 2008-06-26 Hong Kong Applied Science and Technology Research Institute Company Limited Backlight device and liquid crystal display incorporating the backlight device
US20080168129A1 (en) * 2007-01-08 2008-07-10 Jeffrey Robbin Pairing a Media Server and a Media Client
US20080303832A1 (en) * 2007-06-11 2008-12-11 Samsung Electronics Co., Ltd. Method of generating two-dimensional/three-dimensional convertible stereoscopic image bitstream and method and apparatus for displaying the same
US20090002178A1 (en) * 2007-06-29 2009-01-01 Microsoft Corporation Dynamic mood sensing
US20090052164A1 (en) * 2007-08-24 2009-02-26 Masako Kashiwagi Directional backlight, display apparatus, and stereoscopic display apparatus
US20090115783A1 (en) * 2007-11-02 2009-05-07 Dimension Technologies, Inc. 3d optical illusions from off-axis displays
US20090138805A1 (en) * 2007-11-21 2009-05-28 Gesturetek, Inc. Media preferences
US20090133051A1 (en) * 2007-11-21 2009-05-21 Gesturetek, Inc. Device access control
US20090141182A1 (en) * 2007-12-03 2009-06-04 Panasonic Corporation Digital broadcast receiving apparatus, semiconductor integrated circuit, and digital broadcast receiving method
US20090167639A1 (en) * 2008-01-02 2009-07-02 3M Innovative Properties Company Methods of reducing perceived image crosstalk in a multiview display
US20090238378A1 (en) * 2008-03-18 2009-09-24 Invism, Inc. Enhanced Immersive Soundscapes Production
US20090319625A1 (en) * 2008-06-20 2009-12-24 Alcatel Lucent Interactivity in a digital public signage network architecture

Cited By (82)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110102558A1 (en) * 2006-10-05 2011-05-05 Renaud Moliton Display device for stereoscopic display
US8896675B2 (en) * 2006-10-05 2014-11-25 Essilor International (Compagnie Generale D'optique) Display system for stereoscopic viewing implementing software for optimization of the system
US8922545B2 (en) 2009-12-31 2014-12-30 Broadcom Corporation Three-dimensional display system with adaptation based on viewing reference of viewer(s)
US9066092B2 (en) 2009-12-31 2015-06-23 Broadcom Corporation Communication infrastructure including simultaneous video pathways for multi-viewer support
US9049440B2 (en) 2009-12-31 2015-06-02 Broadcom Corporation Independent viewer tailoring of same media source content via a common 2D-3D display
US9019263B2 (en) 2009-12-31 2015-04-28 Broadcom Corporation Coordinated driving of adaptable light manipulator, backlighting and pixel array in support of adaptable 2D and 3D displays
US8823782B2 (en) 2009-12-31 2014-09-02 Broadcom Corporation Remote control with integrated position, viewer identification and optical and audio test
US9013546B2 (en) 2009-12-31 2015-04-21 Broadcom Corporation Adaptable media stream servicing two and three dimensional content
US9124885B2 (en) 2009-12-31 2015-09-01 Broadcom Corporation Operating system supporting mixed 2D, stereoscopic 3D and multi-view 3D displays
US9247286B2 (en) 2009-12-31 2016-01-26 Broadcom Corporation Frame formatting supporting mixed two and three dimensional video data communication
US9979954B2 (en) 2009-12-31 2018-05-22 Avago Technologies General Ip (Singapore) Pte. Ltd. Eyewear with time shared viewing supporting delivery of differing content to multiple viewers
US8854531B2 (en) 2009-12-31 2014-10-07 Broadcom Corporation Multiple remote controllers that each simultaneously controls a different visual presentation of a 2D/3D display
US8988506B2 (en) 2009-12-31 2015-03-24 Broadcom Corporation Transcoder supporting selective delivery of 2D, stereoscopic 3D, and multi-view 3D content from source video
US8687042B2 (en) 2009-12-31 2014-04-01 Broadcom Corporation Set-top box circuitry supporting 2D and 3D content reductions to accommodate viewing environment constraints
US8964013B2 (en) 2009-12-31 2015-02-24 Broadcom Corporation Display with elastic light manipulator
US20110164115A1 (en) * 2009-12-31 2011-07-07 Broadcom Corporation Transcoder supporting selective delivery of 2d, stereoscopic 3d, and multi-view 3d content from source video
US9143770B2 (en) 2009-12-31 2015-09-22 Broadcom Corporation Application programming interface supporting mixed two and three dimensional displays
US8767050B2 (en) 2009-12-31 2014-07-01 Broadcom Corporation Display supporting multiple simultaneous 3D views
US20110157326A1 (en) * 2009-12-31 2011-06-30 Broadcom Corporation Multi-path and multi-source 3d content storage, retrieval, and delivery
US20110164188A1 (en) * 2009-12-31 2011-07-07 Broadcom Corporation Remote control with integrated position, viewer identification and optical and audio test
US9654767B2 (en) 2009-12-31 2017-05-16 Avago Technologies General Ip (Singapore) Pte. Ltd. Programming architecture supporting mixed two and three dimensional displays
US9204138B2 (en) 2009-12-31 2015-12-01 Broadcom Corporation User controlled regional display of mixed two and three dimensional content
US9247232B2 (en) * 2010-04-16 2016-01-26 Samsung Electronics Co., Ltd. Display apparatus, 3D glasses, and display system including the same
US8767051B2 (en) * 2010-04-16 2014-07-01 Samsung Electronics Co., Ltd. Display apparatus, 3D glasses, and display system including the same
US20140313297A1 (en) * 2010-04-16 2014-10-23 Samsung Electronics Co., Ltd. Display apparatus, 3d glasses, and display system including the same
US20110254934A1 (en) * 2010-04-16 2011-10-20 Samsung Electronics Co., Ltd. Display apparatus, 3d glasses, and display system including the same
US20120105610A1 (en) * 2010-11-03 2012-05-03 Samsung Electronics Co., Ltd. Method and apparatus for providing 3d effect in video device
US20120148055A1 (en) * 2010-12-13 2012-06-14 Samsung Electronics Co., Ltd. Audio processing apparatus, audio receiver and method for providing audio thereof
US8913104B2 (en) * 2011-05-24 2014-12-16 Bose Corporation Audio synchronization for two dimensional and three dimensional video signals
US20120300026A1 (en) * 2011-05-24 2012-11-29 William Allen Audio-Video Signal Processing
US20120307048A1 (en) * 2011-05-30 2012-12-06 Sony Ericsson Mobile Communications Ab Sensor-based placement of sound in video recording
US9084068B2 (en) * 2011-05-30 2015-07-14 Sony Corporation Sensor-based placement of sound in video recording
CN103703772A (en) * 2011-07-18 2014-04-02 三星电子株式会社 Content playing method and apparatus
EP2735164A1 (en) * 2011-07-18 2014-05-28 Samsung Electronics Co., Ltd. Content playing method and apparatus
EP2735164A4 (en) * 2011-07-18 2015-04-29 Samsung Electronics Co Ltd Content playing method and apparatus
EP2637416A1 (en) * 2012-03-06 2013-09-11 Alcatel Lucent A system and method for optimized streaming of variable multi-viewpoint media
US9201495B2 (en) * 2012-04-24 2015-12-01 Mobitv, Inc. Control of perspective in multi-dimensional media
US20130278732A1 (en) * 2012-04-24 2013-10-24 Mobitv, Inc. Control of perspective in multi-dimensional media
US9800862B2 (en) * 2012-06-12 2017-10-24 The Board Of Trustees Of The University Of Illinois System and methods for visualizing information
US20130328777A1 (en) * 2012-06-12 2013-12-12 Andrew Johnson System and methods for visualizing information
US20140362196A1 (en) * 2012-08-03 2014-12-11 Samsung Electronics Co., Ltd. Display apparatus which displays a plurality of content views, glasses apparatus which synchronizes with one of the content views, and methods thereof
US20140036044A1 (en) * 2012-08-03 2014-02-06 Samsung Electronics Co., Ltd. Display apparatus which displays a plurality of content views, glasses apparatus which synchronizes with one of the content views, and methods thereof
US8842169B2 (en) * 2012-08-03 2014-09-23 Samsung Electronics Co., Ltd. Display apparatus which displays a plurality of content views, glasses apparatus which synchronizes with one of the content views, and methods thereof
US20150316640A1 (en) * 2012-09-17 2015-11-05 Nokia Technologies Oy Method and apparatus for associating audio objects with content and geo-location
US9612311B2 (en) * 2012-09-17 2017-04-04 Nokia Technologies Oy Method and apparatus for associating audio objects with content and geo-location
US9265458B2 (en) 2012-12-04 2016-02-23 Sync-Think, Inc. Application of smooth pursuit cognitive testing paradigms to clinical drug development
US9047054B1 (en) * 2012-12-20 2015-06-02 Audible, Inc. User location-based management of content presentation
US9591071B2 (en) 2012-12-20 2017-03-07 Audible, Inc. User location-based management of content presentation
US9380976B2 (en) 2013-03-11 2016-07-05 Sync-Think, Inc. Optical neuroinformatics
US20140328505A1 (en) * 2013-05-02 2014-11-06 Microsoft Corporation Sound field adaptation based upon user tracking
WO2014179633A1 (en) * 2013-05-02 2014-11-06 Microsoft Corporation Sound field adaptation based upon user tracking
US9591295B2 (en) * 2013-09-24 2017-03-07 Amazon Technologies, Inc. Approaches for simulating three-dimensional views
US20150085076A1 (en) * 2013-09-24 2015-03-26 Amazon Technologies, Inc. Approaches for simulating three-dimensional views
EP2963951A1 (en) * 2014-07-02 2016-01-06 Samsung Electronics Co., Ltd Method, user terminal, and audio system, for speaker location detection and level control using magnetic field
US9906864B2 (en) 2014-07-02 2018-02-27 Samsung Electronics Co., Ltd. Method, user terminal, and audio system, for speaker location detection and level control using magnetic field
WO2017156622A1 (en) * 2016-03-13 2017-09-21 Rising Sun Productions Limited Head-mounted audiovisual capture device
US20170366914A1 (en) * 2016-06-17 2017-12-21 Edward Stein Audio rendering using 6-dof tracking
US9973874B2 (en) * 2016-06-17 2018-05-15 Dts, Inc. Audio rendering using 6-DOF tracking
US10820134B2 (en) 2016-06-17 2020-10-27 Dts, Inc. Near-field binaural rendering
US10200806B2 (en) 2016-06-17 2019-02-05 Dts, Inc. Near-field binaural rendering
US10231073B2 (en) 2016-06-17 2019-03-12 Dts, Inc. Ambisonic audio rendering with depth decoding
US10235010B2 (en) 2016-07-28 2019-03-19 Canon Kabushiki Kaisha Information processing apparatus configured to generate an audio signal corresponding to a virtual viewpoint image, information processing system, information processing method, and non-transitory computer-readable storage medium
EP3276982A1 (en) * 2016-07-28 2018-01-31 Canon Kabushiki Kaisha Information processing apparatus, information processing method, and program
US10664128B2 (en) 2016-07-28 2020-05-26 Canon Kabushiki Kaisha Information processing apparatus, configured to generate an audio signal corresponding to a virtual viewpoint image, information processing system, information processing method, and non-transitory computer-readable storage medium
US20180046431A1 (en) * 2016-08-10 2018-02-15 Qualcomm Incorporated Multimedia device for processing spatialized audio based on movement
US10089063B2 (en) * 2016-08-10 2018-10-02 Qualcomm Incorporated Multimedia device for processing spatialized audio based on movement
US10514887B2 (en) 2016-08-10 2019-12-24 Qualcomm Incorporated Multimedia device for processing spatialized audio based on movement
US20180176545A1 (en) * 2016-11-25 2018-06-21 Nokia Technologies Oy Virtual reality display
US10491886B2 (en) * 2016-11-25 2019-11-26 Nokia Technologies Oy Virtual reality display
US10802324B2 (en) 2017-03-14 2020-10-13 Boe Technology Group Co., Ltd. Double vision display method and device
US11443487B2 (en) * 2017-06-30 2022-09-13 Nokia Technologies Oy Methods, apparatus, systems, computer programs for enabling consumption of virtual content for mediated reality
US10777057B1 (en) * 2017-11-30 2020-09-15 Amazon Technologies, Inc. Premises security system with audio simulating occupancy
CN111512640A (en) * 2017-12-20 2020-08-07 诺基亚技术有限公司 Multi-camera device
WO2019123149A1 (en) * 2017-12-20 2019-06-27 Nokia Technologies Oy Multi-camera device
EP3503579A1 (en) * 2017-12-20 2019-06-26 Nokia Technologies Oy Multi-camera device
US11503226B2 (en) 2017-12-20 2022-11-15 Nokia Technologies Oy Multi-camera device
US10609503B2 (en) 2018-04-08 2020-03-31 Dts, Inc. Ambisonic depth extraction
US20220248088A1 (en) * 2018-04-11 2022-08-04 Alcacruz Inc. Digital media system
US11589110B2 (en) * 2018-04-11 2023-02-21 Alcacruz Inc. Digital media system
US11337020B2 (en) 2018-06-07 2022-05-17 Nokia Technologies Oy Controlling rendering of a spatial audio scene
US10932080B2 (en) 2019-02-14 2021-02-23 Microsoft Technology Licensing, Llc Multi-sensor object tracking for modifying audio
US20220068185A1 (en) * 2019-04-29 2022-03-03 Hewlett-Packard Development Company, L.P. Wireless configuration of display attribute

Also Published As

Publication number Publication date
US20150264341A1 (en) 2015-09-17
US8767050B2 (en) 2014-07-01
CN102183840A (en) 2011-09-14
US20110157326A1 (en) 2011-06-30
US9049440B2 (en) 2015-06-02
US20110157339A1 (en) 2011-06-30
US8988506B2 (en) 2015-03-24
US20110157167A1 (en) 2011-06-30
TW201142357A (en) 2011-12-01
US20110157168A1 (en) 2011-06-30
US9013546B2 (en) 2015-04-21
US9066092B2 (en) 2015-06-23
US20110157696A1 (en) 2011-06-30
US20150156473A1 (en) 2015-06-04
US20110157170A1 (en) 2011-06-30
EP2357631A1 (en) 2011-08-17
TWI467234B (en) 2015-01-01
US9143770B2 (en) 2015-09-22
US20110164111A1 (en) 2011-07-07
EP2346021B1 (en) 2014-11-19
US20110164034A1 (en) 2011-07-07
US20110169913A1 (en) 2011-07-14
US8687042B2 (en) 2014-04-01
CN102183841A (en) 2011-09-14
US20110157309A1 (en) 2011-06-30
TW201142356A (en) 2011-12-01
US20110157264A1 (en) 2011-06-30
EP2357508A1 (en) 2011-08-17
US20110157471A1 (en) 2011-06-30
TW201137399A (en) 2011-11-01
US9204138B2 (en) 2015-12-01
CN102183841B (en) 2014-04-02
US8922545B2 (en) 2014-12-30
EP2346021A1 (en) 2011-07-20
CN102215408A (en) 2011-10-12
US8964013B2 (en) 2015-02-24
US20110169930A1 (en) 2011-07-14
EP2357630A1 (en) 2011-08-17
US20110157322A1 (en) 2011-06-30
US9654767B2 (en) 2017-05-16
US20110157172A1 (en) 2011-06-30
US20110157169A1 (en) 2011-06-30
HK1161754A1 (en) 2012-08-03
US20110157336A1 (en) 2011-06-30
US9979954B2 (en) 2018-05-22
US20110157315A1 (en) 2011-06-30
US9019263B2 (en) 2015-04-28
US20110161843A1 (en) 2011-06-30
US20110157257A1 (en) 2011-06-30
US20110157330A1 (en) 2011-06-30
US20110164115A1 (en) 2011-07-07
US20150015668A1 (en) 2015-01-15
US9124885B2 (en) 2015-09-01
US20110157697A1 (en) 2011-06-30

Similar Documents

Publication Publication Date Title
US20110157327A1 (en) 3d audio delivery accompanying 3d display supported by viewer/listener position and orientation tracking
US8823782B2 (en) Remote control with integrated position, viewer identification and optical and audio test
US10979845B1 (en) Audio augmentation using environmental data
US20050264857A1 (en) Binaural horizontal perspective display
CN109691141B (en) Spatialization audio system and method for rendering spatialization audio
US20120128184A1 (en) Display apparatus and sound control method of the display apparatus
EP4242829A2 (en) Audio apparatus and method of audio processing
WO2022262839A1 (en) Stereoscopic display method and apparatus for live performance, medium, and system
EP3595337A1 (en) Audio apparatus and method of audio processing
US20210211826A1 (en) Method and apparatus for providing audio content in immersive reality
JP2016046699A (en) Image voice input/output system
JP2020501275A (en) Image processing system and method
JP5533282B2 (en) Sound playback device
EP3402410A1 (en) Detection system
JP4955718B2 (en) Stereoscopic display control apparatus, stereoscopic display system, and stereoscopic display control method
KR20180118034A (en) Apparatus and method for controlling spatial audio according to eye tracking
GB2558279A (en) Head mountable display system
WO2012062174A1 (en) Single-screen multi-frequency synchronized 2d and 3d audio-visual system
JP2011234139A (en) Three-dimensional audio signal generating device
NL2030186B1 (en) Autostereoscopic display device presenting 3d-view and 3d-sound
KR20190064394A (en) 360 degree VR partition circle vision display apparatus and method thereof
GB2569576A (en) Audio generation system
JP2012253707A (en) Stereoscopic video display device and sound reproduction device
KR20130054569A (en) Apparatus and the method for implementation 3d sound according to head pose
JP2013172388A (en) Display device and program

Legal Events

Date Code Title Description
AS Assignment

Owner name: BANK OF AMERICA, N.A., AS COLLATERAL AGENT, NORTH CAROLINA

Free format text: PATENT SECURITY AGREEMENT;ASSIGNOR:BROADCOM CORPORATION;REEL/FRAME:037806/0001

Effective date: 20160201

STCB Information on status: application discontinuation

Free format text: ABANDONED -- AFTER EXAMINER'S ANSWER OR BOARD OF APPEALS DECISION

AS Assignment

Owner name: AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD., SINGAPORE

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:BROADCOM CORPORATION;REEL/FRAME:041706/0001

Effective date: 20170120

AS Assignment

Owner name: BROADCOM CORPORATION, CALIFORNIA

Free format text: TERMINATION AND RELEASE OF SECURITY INTEREST IN PATENTS;ASSIGNOR:BANK OF AMERICA, N.A., AS COLLATERAL AGENT;REEL/FRAME:041712/0001

Effective date: 20170119