US20110157315A1 - Interpolation of three-dimensional video content - Google Patents
- Publication number: US20110157315A1
- Application number: US 12/982,248 (US98224810A)
- Authority: US (United States)
- Prior art keywords
- interpolation
- encoded
- frame
- processing circuitry
- video content
- Prior art date
- Legal status: Abandoned
Classifications
- H04N13/361—Reproducing mixed stereoscopic images; reproducing mixed monoscopic and stereoscopic images, e.g. a stereoscopic image overlay window on a monoscopic image background
- G03B35/24—Stereoscopic photography by simultaneous viewing using apertured or refractive resolving means on screens or between screen and eye
- G06F3/14—Digital output to display device; cooperation and interconnection of the display device with other functional units
- G09G3/003—Control arrangements or circuits for visual indicators other than cathode-ray tubes, to produce spatial visual effects
- G09G3/20—Control arrangements or circuits for presentation of an assembly of a number of characters by combination of individual elements arranged in a matrix
- H04N13/00—Stereoscopic video systems; multi-view video systems; details thereof
- H04N13/139—Format conversion, e.g. of frame-rate or size
- H04N13/161—Encoding, multiplexing or demultiplexing different image signal components
- H04N13/189—Recording image signals; reproducing recorded image signals
- H04N13/194—Transmission of image signals
- H04N13/305—Autostereoscopic image reproducers using lenticular lenses, e.g. arrangements of cylindrical lenses
- H04N13/31—Autostereoscopic image reproducers using parallax barriers
- H04N13/312—Parallax barriers placed behind the display panel, e.g. between backlight and spatial light modulator [SLM]
- H04N13/315—Parallax barriers being time-variant
- H04N13/332—Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
- H04N13/351—Multi-view displays for displaying three or more geometrical viewpoints without viewer tracking, for displaying simultaneously
- H04N13/359—Switching between monoscopic and stereoscopic modes
- H04N13/366—Image reproducers using viewer tracking
- H04N13/383—Viewer tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes
- H04N13/398—Synchronisation and control of image reproducers
- H04N21/235—Processing of additional data, e.g. scrambling of additional data or processing content descriptors
- H04N21/4122—Peripherals receiving signals from specially adapted client devices, e.g. an additional display device such as a video projector
- H04N21/435—Processing of additional data, e.g. decrypting of additional data, reconstructing software from modules extracted from the transport stream
- H04S7/303—Tracking of listener position or orientation
- G02B6/00—Light guides; structural details of arrangements comprising light guides and other optical elements, e.g. couplings
- G06F3/0346—Pointing devices with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF pointers using gyroscopes, accelerometers or tilt-sensors
- G09G2300/023—Display panel composed of stacked panels
- G09G2320/028—Improving the quality of display appearance by changing the viewing angle properties, e.g. widening the viewing angle or adapting it to the view direction
- G09G2370/04—Exchange of auxiliary data, i.e. other than image data, between monitor and graphics controller
- G09G5/003—Details of a display terminal relating to the control arrangement of the terminal and the interfaces thereto
- G09G5/14—Display of multiple viewports
- H04N2013/403—Privacy aspects: devices showing different monoscopic images to different viewers
- H04N2013/405—Privacy aspects: devices showing different stereoscopic or three-dimensional images to different viewers
Definitions
- the present invention relates to techniques for processing video images.
- Images may be transmitted for display in various forms.
- television is a widely used telecommunication medium for transmitting and displaying images in monochromatic (“black and white”) or color form.
- images have traditionally been provided in analog form and displayed by display devices as two-dimensional images. More recently, images are being provided in digital form for display in two dimensions on display devices having improved resolution. Even more recently, images capable of being displayed in three dimensions are being provided.
- Conventional displays may use a variety of techniques to achieve three-dimensional image viewing functionality.
- various types of glasses have been developed that may be worn by users to view three-dimensional images displayed by a conventional display.
- glasses include glasses that utilize color filters or polarized filters.
- the lenses of the glasses pass two-dimensional images of differing perspective to the user's left and right eyes.
- the images are combined in the visual center of the brain of the user to be perceived as a three-dimensional image.
- synchronized left eye, right eye LCD (liquid crystal display) shutter glasses may be used with conventional two-dimensional displays to create a three-dimensional viewing illusion.
- LCD display glasses may be used to display three-dimensional images to a user.
- the lenses of the LCD display glasses include corresponding displays that provide images of differing perspective to the user's eyes, to be perceived by the user as three-dimensional.
- a display may include a parallax barrier that has a layer of material with a series of precision slits. The parallax barrier is placed proximal to a display so that a user's eyes each see a different set of pixels to create a sense of depth through parallax.
- Another type of display for viewing three-dimensional images is one that includes a lenticular lens.
- a lenticular lens includes an array of magnifying lenses configured so that when viewed from slightly different angles, different images are magnified. Displays are being developed that use lenticular lenses to enable autostereoscopic images to be generated.
- Each technique for achieving three-dimensional image viewing functionality involves transmitting three-dimensional video content to a display device, so that the display device can display three-dimensional images that are represented by the three-dimensional video content to a user.
- a variety of issues may arise with respect to such transmission. For example, errors that occur during the transmission may cause frame data in the video content to become corrupted.
- a source of the video content and/or the channels through which the video content is transferred may become temporarily unable to handle a load that is imposed by the video content.
- the display device may be capable of processing frame data of a greater number of perspectives than the source is capable of providing.
- FIG. 1 is a block diagram of an exemplary system for generating three-dimensional video content that may be encoded in accordance with an embodiment.
- FIG. 2 is a block diagram of an exemplary display system according to an embodiment.
- FIG. 3 depicts an exemplary implementation of an encoding system shown in FIG. 2 in accordance with an embodiment.
- FIGS. 4-9 show flowcharts of exemplary methods for encoding portions of three-dimensional video content for subsequent interpolation according to embodiments.
- FIG. 10 depicts an exemplary implementation of a decoding system shown in FIG. 2 in accordance with an embodiment.
- FIGS. 11-16 show flowcharts of exemplary methods for decoding portions of encoded three-dimensional video content using interpolation according to embodiments.
- FIGS. 17-20 illustrate exemplary interpolation techniques according to embodiments.
- FIG. 21 is a block diagram of an exemplary electronic device according to an embodiment.
- references in the specification to “one embodiment,” “an embodiment,” “an example embodiment,” or the like, indicate that the embodiment described may include a particular feature, structure, or characteristic, but every embodiment may not necessarily include the particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same embodiment. Furthermore, when a particular feature, structure, or characteristic is described in connection with an embodiment, it is submitted that it is within the knowledge of one skilled in the relevant art(s) to implement such feature, structure, or characteristic in connection with other embodiments whether or not explicitly described.
- Example embodiments relate to interpolation of three-dimensional video content.
- Three-dimensional video content is video content that includes portions representing respective frame sequences that provide respective perspective views of a given subject matter over the same period of time.
- an upstream device analyzes the three-dimensional video content to identify one or more interpolation opportunities.
- An interpolation opportunity occurs when a target perspective view that is associated with the three-dimensional video content is between reference perspective views that are associated with the three-dimensional video content.
- the target perspective view and the reference perspective views are perspective views of a common video event that are provided by respective sequences of frames (alternatively referred to herein as “images” or “pictures”) that are represented by respective portions of the three-dimensional video content.
- assume, for illustrative purposes, that three-dimensional video content includes portions PA, PB, and PC that represent respective perspective views VA, VB, and VC. Further assume that VB is between VA and VC. In accordance with this example, an interpolation opportunity is said to occur for providing an interpolated representation of PB based on PA and PC.
- frame data that is associated with the interpolation opportunity may be replaced with an interpolation marker.
- the upstream device may replace PB with the interpolation marker.
- the downstream device may replace the interpolation marker with an interpolated representation of the frame data that the interpolation marker replaced. For instance, the downstream device may interpolate between the portions of the three-dimensional video content that represent the sequences of frames that provide the reference perspective views to generate an interpolated representation (a.k.a. an interpolation) of the portion of the three-dimensional video content that represents the sequence of frames that provides the target perspective view. In accordance with the example mentioned above, the downstream device may interpolate between PA and PC to generate an interpolated representation of PB.
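- To make the PA/PC example concrete, the following is a minimal sketch of the blend a downstream device might perform. The function name, the use of NumPy arrays as frame data, and the per-pixel weighted blend are illustrative assumptions; the patent leaves the exact interpolation method open.

```python
import numpy as np

def interpolate_view(frame_a: np.ndarray, frame_c: np.ndarray,
                     weight_a: float = 0.5) -> np.ndarray:
    """Estimate the middle perspective (PB) from its two reference
    perspectives (PA and PC) by a per-pixel weighted blend."""
    blend = (weight_a * frame_a.astype(np.float32)
             + (1.0 - weight_a) * frame_c.astype(np.float32))
    return blend.round().astype(frame_a.dtype)
```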
- the downstream device identifies a frame that is not directly represented by data that is included in the three-dimensional video content.
- the frame may be represented by an interpolation marker.
- the downstream device may perform an interpolation operation with respect to portions of the three-dimensional video content even in the absence of an interpolation marker.
- the data may be corrupted.
- the frame may be missing from the data, or a portion of the data that corresponds to the frame may include erroneous data. Accordingly, the interpolation need not necessarily be performed in response to an interpolation marker.
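- A sketch of how such identification might look in code follows; the stream layout (a mapping from frame index to either a sentinel marker or a (payload, checksum) pair) is purely an assumption for illustration.

```python
import zlib

MARKER = b"INTERP"  # assumed sentinel standing in for an interpolation marker

def needs_interpolation(stream: dict, index: int) -> bool:
    """True when frame `index` is not directly represented: it carries a
    marker, is missing entirely, or fails a checksum (erroneous data)."""
    entry = stream.get(index)
    if entry is None:          # frame missing from the data
        return True
    if entry == MARKER:        # frame replaced by an interpolation marker
        return True
    payload, stored_crc = entry
    return zlib.crc32(payload) != stored_crc  # corrupted frame data
```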
- the embodiments described herein have a variety of benefits as compared to conventional techniques for processing video content.
- the embodiments may increase the likelihood that a source of the video content and/or the channels through which the video content is transferred are capable of handling a load that is imposed by the video content.
- the embodiments may be capable of increasing the number of perspectives that are provided by the video content.
- the embodiments may be capable of correcting corrupted data that is included in the video content based on other data in the video content. For instance, the corrupted data may be corrected on the fly using one or more of the techniques described herein.
- three-dimensional video content is represented as a plurality of separate portions (a.k.a. digital video streams). Each portion represents a respective frame sequence that provides a respective perspective view of a video event.
- FIG. 1 is a diagram of an exemplary system 100 for generating three-dimensional video content that may be encoded in accordance with an embodiment.
- system 100 includes a plurality of video cameras 102 A- 102 N that are directed at and operate to record images of the same subject matter 104 from different perspectives over the same period of time.
- This results in the generation of three-dimensional video content 106 which includes N different portions 108 A- 108 N that provide different perspective views of subject matter 104 over the same period of time.
- one or more of the portions 108 A- 108 N may be created in a manual or automated fashion by digital animators using advanced graphics and animation tools.
- at least one of the portions 108 A- 108 N may be created by using a manual or automated interpolation process that creates a portion based on analysis of at least two of the other portions. For example, with reference to FIG. 1 , if camera 102 B were absent, a digital video stream corresponding to the perspective view of subject matter 104 provided by that camera could nevertheless be created by performing an interpolation process on the portions of the three-dimensional video content 106 produced by camera 102 A and another of the cameras. Still other techniques not described herein may be used to produce one or more of the different digital video streams.
- Display systems have been described that can display a single image of certain subject matter to provide a two-dimensional view thereof and that can also display two images of the same subject matter viewed from different perspectives in an integrated manner to provide a three-dimensional view thereof.
- Such two-dimensional (2D)/three-dimensional (3D) display systems can further display a multiple of two images (e.g., four images, eight images, etc.) of the same subject matter viewed from different perspectives in an integrated manner to simultaneously provide multiple three-dimensional views thereof, wherein the particular three-dimensional view perceived by a viewer is determined based at least in part on the position of the viewer. Examples of such 2D/3D display systems are described in the following commonly-owned, co-pending U.S. Patent Applications: U.S. patent application Ser. No.
- the portions 108 A- 108 N produced by system 100 can be obtained and provided to a 2D/3D display system as described above in order to facilitate the presentation of a two-dimensional view of subject matter 104 , a single three-dimensional view of subject matter 104 , or multiple three-dimensional views of subject matter 104 .
- FIG. 2 is a block diagram of an exemplary display system 200 according to an embodiment.
- display system 200 operates to transmit three-dimensional video content, such as three-dimensional video content 106 of FIG. 1 , to a display device, so that the display device can display three-dimensional images that are represented by the three-dimensional video content to user(s).
- display system 200 interpolates between portions of the three-dimensional video content that correspond to respective perspective views to provide frame data that corresponds to another perspective view.
- display system 200 includes source(s) 202 and a display device 204 .
- Source(s) 202 provide three-dimensional video content 206 .
- Source(s) 202 can include any number of sources, including one, two, three, etc.
- Each source provides one or more portions of the three-dimensional video content 206 .
- Examples of a source include but are not limited to a computer storage disc (e.g., a digital video disc (DVD) or a Blu-Ray® disc), local storage on a display device, a remote server (i.e., a server that is located remotely from the display device), a gaming system, a satellite, a cable headend, and a point-to-point system.
- the reference portions may include 2D data, 3D2 data, 3D4 data, 3D8 data, etc.
- Supplemental portions may include auto-interpolated (A-I) 2D-3D2 (single stream) data, manually generated interpolation (M-G-I) 3D2 data, A-I 3D4 (3 stream) data, M-G-I 3D4 (3 stream) data, etc.
- source(s) 202 includes an encoding system 208 .
- Encoding system 208 encodes the three-dimensional video content 206 to provide encoded three-dimensional video content 210 .
- encoding system 208 may replace frame data in the three-dimensional video content 206 with an interpolation marker.
- the interpolation marker may indicate that interpolation is to be performed between portions of the three-dimensional video content in order to generate an interpolated representation of the frame data that is replaced with the interpolation marker.
- the interpolation marker may be accompanied by instructions for generating the interpolated representation. It will be recognized, however, that encoding system 208 need not necessarily replace frame data in the three-dimensional video content 206 with an interpolation marker. Regardless, encoding system 208 transmits the encoded three-dimensional video content 210 toward display device 204 via communication channels 212 .
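- A sketch of the substitution step encoding system 208 might perform: where analysis finds an interpolation opportunity, a sentinel goes out in place of the encoded frame bytes. The list-of-frames model and the sentinel value are assumptions, not the patent's wire format.

```python
MARKER = b"INTERP"  # assumed stand-in for an interpolation marker

def substitute_markers(encoded_frames: list, opportunities: set) -> list:
    """Return the frame list with flagged positions replaced by the marker.
    `opportunities` holds the frame indices identified for interpolation."""
    return [MARKER if i in opportunities else frame
            for i, frame in enumerate(encoded_frames)]
```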
- source(s) 202 need not necessarily include encoding system 208 .
- source(s) 202 may store the encoded three-dimensional video content 210 , rather than generating the encoded three-dimensional video content 210 based on the three-dimensional video content 206 .
- Communication channels 212 may include one or more local device pathways, point-to-point links, and/or pathways in a hybrid fiber coaxial (HFC) network, a wide-area network (e.g., the Internet), a local area network (LAN), another type of network, or a combination thereof.
- Communication channels 212 may support wired, wireless, or both transmission media, including satellite, terrestrial (e.g., fiber optic, copper, twisted pair, coaxial, or the like), radio, microwave, free-space optics, and/or any other form or method of transmission.
- Display device 204 displays images to user(s) upon receipt of the encoded three-dimensional video content 210 .
- Display device 204 may be implemented in various ways.
- display device 204 may be a television display (e.g., a liquid crystal display (LCD) television, a plasma television, etc.), a computer monitor, a projection system, or any other type of display device.
- Display device 204 includes an interpolation-enabled decoding system 214 , display circuitry 216 , and a screen 218 .
- Decoding system 214 decodes the encoded three-dimensional video content 210 to provide decoded three-dimensional video content 220 .
- decoding system 214 may interpolate between portions of a decoded representation of the encoded three-dimensional video content 210 to generate one or more of the portions of the decoded three-dimensional video content 220 .
- decoding system 214 may interpolate in response to detecting an interpolation indicator in the encoded three-dimensional video content 210 .
- decoding system 214 may interpolate in response to determining that a frame that is included in the decoded representation of the encoded three-dimensional video content 210 is not directly represented by data in the decoded representation. For instance, decoding system 214 may determine that the frame is replaced by an interpolation marker, that the frame is missing from the data, or that a portion of the data that corresponds to the frame includes erroneous data. Interpolation that is performed by decoding system 214 may be incorporated into a decoding process or may be performed after such a decoding process on raw data.
- decoding system 214 maintains synchronization of the portions that are included in the decoded three-dimensional video content 220 . For instance, such synchronization may be maintained during inter-reference frame periods, during screen reconfiguration, etc. If decoding system 214 is unable to maintain synchronization with respect to one or more portions of the decoded three-dimensional video content 220 , decoding system 214 may perform interpolation to generate interpolated representations of those portion(s) until synchronization is re-established. Decoding system 214 may synchronize 3DN adjustments with reference frame occurrence, where N can be any positive integer greater than or equal to two. A 3DN adjustment may include the addition of frame data corresponding to a perspective view, for example. For each additional perspective that is represented by the decoded three-dimensional video content 220 , N is incremented by one.
- Display circuitry 216 directs display of one or more of the frame sequences that are represented by the decoded three-dimensional video content 220 toward screen 218 , as indicated by arrow 222 , for presentation to the user(s). It will be recognized that although display circuitry 216 is labeled as such, the functionality of display circuitry 216 may be implemented in hardware, software, firmware, or any combination thereof.
- Screen 218 displays the frame sequence(s) that are received from display circuitry 216 to the user(s).
- Screen 218 may be any suitable type of screen, including but not limited to an LCD screen, a plasma screen, a light emitting device (LED) screen (e.g., an OLED (organic LED) screen), etc.
- encoding system 208 may be external to source(s) 202 .
- decoding system 214 may be external to display device 204 .
- encoding system 208 and decoding system 214 may be implemented in a common device, such as a transcoder that is coupled between source(s) 202 and display device 204 .
- feedback may be provided from communication channels 212 and/or display device 204 to any one or more of the source(s) 202 .
- display device 204 may provide feedback to indicate an error that occurs with respect to frame data that is included in encoded three-dimensional video content 210 , one or more characteristics that are associated with display device 204 , etc. Examples of such characteristics include but are not limited to a load that is associated with display device 204 and a number of perspective views that display device 204 is capable of processing.
- channels 212 may provide feedback to indicate an error that occurs with respect to frame data that is included in encoded three-dimensional video content 210 , one or more characteristics (e.g., a load) that are associated with the channels 212 , etc.
- FIG. 3 depicts a block diagram of an encoding system 300 , which is an exemplary implementation of encoding system 208 of FIG. 2 , in accordance with an embodiment.
- encoding system 300 includes input circuitry 302 , processing circuitry 304 , and output circuitry 306 .
- Input circuitry 302 serves as an input interface for encoding system 300 .
- Processing circuitry 304 receives a plurality of portions 310 A- 310 N of three-dimensional video content 308 through input circuitry 302 . Each of the portions 310 A- 310 N represents a respective sequence of frames that provides a respective perspective view of a video event.
- Processing circuitry 304 encodes the portions 310 A- 310 N to provide encoded portions 314 A- 314 N.
- Processing circuitry 304 analyzes at least some of the portions 310 A- 310 N to identify one or more interpolation opportunities.
- An interpolation opportunity occurs when a target perspective view that is associated with the three-dimensional video content 308 is between reference perspective views that are associated with the three-dimensional video content 308 .
- the target perspective view and the reference perspective views are provided by respective sequences of frames that are represented by respective portions of the three-dimensional video content 308 .
- processing circuitry 304 replaces frame data that is included in the corresponding portion of the three-dimensional video content 308 with an interpolation marker.
- if processing circuitry 304 identifies an interpolation opportunity in each of first portion 310 A and second portion 310 B, processing circuitry 304 replaces frame data that is included in first portion 310 A with an interpolation marker and replaces frame data that is included in second portion 310 B with another interpolation marker.
- any one or more of the interpolation marker(s) may be accompanied by an interpolation instruction.
- a first interpolation instruction that corresponds to a first interpolation marker may specify which of the portions 310 A- 310 N of the three-dimensional video content 308 are to be used for generating an interpolated representation of the frame data that the first interpolation marker replaces.
- a second interpolation instruction that corresponds to a second interpolation marker may specify which of the portions 310 A- 310 N are to be used for generating an interpolated representation of the frame data that the second interpolation marker replaces, and so on.
- Each interpolation marker may specify a type of interpolation to be performed to generate an interpolated representation of the frame data that the interpolation marker replaces. For instance, a first type of interpolation may assign a first weight to a first reference portion of the three-dimensional video content 308 and a second weight that is different from the first weight to a second reference portion of the three-dimensional video content 308 for generating an interpolated representation of frame data. A second type of interpolation may assign equal weights to the first and second reference portions of the three-dimensional video content 308 .
- Other exemplary types of interpolation include but are not limited to linear interpolation, polynomial interpolation, and spline interpolation.
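- A sketch of two of the interpolation types named above: a weighted (equal or unequal) blend of two reference portions, and linear interpolation across camera positions; polynomial or spline variants would follow the same pattern with a higher-order fit. The function names and the one-dimensional camera baseline are assumptions.

```python
import numpy as np

def weighted_blend(ref_a: np.ndarray, ref_b: np.ndarray,
                   w_a: float = 0.5) -> np.ndarray:
    """w_a = 0.5 gives the equal-weight case; any other value gives the
    unequal-weight case described above."""
    blend = (w_a * ref_a.astype(np.float32)
             + (1.0 - w_a) * ref_b.astype(np.float32))
    return blend.astype(ref_a.dtype)

def linear_view_interp(positions, frames, target_pos):
    """Linearly interpolate a view at `target_pos`, assumed to lie strictly
    between the first and last camera `positions` (sorted ascending)."""
    idx = int(np.searchsorted(positions, target_pos))
    lo, hi = idx - 1, idx
    t = (target_pos - positions[lo]) / (positions[hi] - positions[lo])
    return weighted_blend(frames[lo], frames[hi], w_a=1.0 - t)
```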
- Output circuitry 306 serves as an output interface for encoding system 300 .
- Processing circuitry 304 delivers encoded three-dimensional video content 312 that includes encoded portions 314 A- 314 N through output circuitry 306 .
- FIGS. 4-9 show flowcharts 400 , 500 , 600 , 700 , 800 , and 900 of exemplary methods for encoding portions of three-dimensional video content for subsequent interpolation according to embodiments.
- Flowcharts 400 , 500 , 600 , 700 , 800 , and 900 may be performed by encoding system 300 shown in FIG. 3 , for example.
- the methods of flowcharts 400 , 500 , 600 , 700 , 800 , and 900 are not limited to that embodiment.
- the basic approach involves encoder processing of at least a first sequence of frames and a second sequence of frames, wherein the first sequence represents a first perspective view (e.g., a right eye view) while the second sequence represents a second perspective view (e.g., a left eye view).
- many frames will be encoded based on the frame itself (no referencing to other frames), internal referencing (referencing frames within the same sequence of frames), or external referencing (referencing frames outside of the current frame's sequence of frames).
- interpolation information may be nothing more than an indicator or marker (an “interpolation marker”) but may also contain interpolation instructions, data and parameters.
- a current frame is encoded to determine the size of the resultant encoded frame data. If the size is less than an established threshold, interpolation may not be applied. But if, for example, dropping the current frame offers a justifiable data savings and the frame is not referenced by other frames, it is a prime candidate for interpolation.
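- A sketch of that screening step follows; `encode_frame` is a placeholder for whatever codec the encoder actually applies, and the threshold value is an assumption.

```python
SIZE_THRESHOLD = 4096  # bytes; illustrative value only

def is_interpolation_candidate(frame, encode_frame, referenced: bool) -> bool:
    """Flag a frame whose encoded size justifies dropping it, provided no
    other frame references it for decoding."""
    if len(encode_frame(frame)) < SIZE_THRESHOLD:
        return False  # already cheap to carry; interpolation buys little
    return not referenced  # prime candidate: big savings, no dependents
```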
- the encoder processing involves applying at least one interpolation approach but, depending on the embodiment, may apply multiple approaches (along with variations of the underlying parameters). If only one approach is applied, a determination is made as to whether such interpolation can be used to yield a visually acceptable output. When multiple approaches are available, a selection is made therefrom of (i) a best match that is also determined to be visually acceptable, (ii) the first match that can be used to yield something visually acceptable, or (iii) an acceptable match selected at least in part based on the ease of decoding and/or the size of the interpolation information.
- the encoder processing involves selecting to use the interpolation information (or use nothing to force default interpolation by a decoder) instead of the encoded frame data in subsequent storage and/or transmissions.
- an interpolation opportunity might involve replacing the middle frame in the sequence with nothing at all, to force the decoder to interpolate between the first frame and the third frame.
- alternatively, a marker (an "interpolation marker") may be sent in place of the second frame's encoded data.
- a decoder might either (i) substitute the first or the third frame data for the missing second frame data, which is likely to not be noticed by a viewer due to the relatively short frame rate period, (ii) create a substitute for the missing second frame data by creating an average between the first frame and the third frame (e.g., a 50/50 "weighted" addition), or (iii) otherwise create a substitute based on a weighted addition percentage or using some other interpolation approach that may utilize interpolation parameters, filters and other data.
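- The three fallbacks map naturally onto a small dispatcher; a sketch follows, with NumPy frames and the mode names assumed for illustration.

```python
import numpy as np

def substitute_frame(prev: np.ndarray, nxt: np.ndarray,
                     mode: str = "average", w_prev: float = 0.5) -> np.ndarray:
    if mode == "repeat":
        return prev.copy()          # (i) reuse an adjacent frame outright
    if mode == "average":
        w_prev = 0.5                # (ii) the 50/50 "weighted" addition
    # (iii) general weighted addition with a caller-supplied percentage
    blend = (w_prev * prev.astype(np.float32)
             + (1.0 - w_prev) * nxt.astype(np.float32))
    return blend.astype(prev.dtype)
```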
- a first and second frame sequence might be stitched together and then cropped to produce a substitute for the missing middle frame data.
- some objects, such as a moving car, may appear stationary in at least some areas within the field of view, so multiple interpolation approaches within a single frame may be applied.
- although the interpolation opportunities above were applied to a single camera view's frame sequence, interpolation with reference to other camera views' frame sequences may also be performed.
- an object moving in a frame of a first camera's frame sequence might have strong correlation with the same object a short time later captured in a frame of the second camera's frame sequence.
- the correlating frame of the second camera's frame sequence is discarded or replaced, at least the frame in the first camera's frame sequence can be used by a decoder to recreate the missing data.
- a single frame (or frame portion) alone or along with other frames (or frame portions) from either or both camera sequences can be used by the decoder to recreate the substitute.
- if the encoded frame data is deleted outright, a decoder will conclude that interpolation is needed and respond by either repeating an adjacent frame (e.g., if adjacent frames are not substantially different) or creating a middling alternative based on both preceding and subsequent frame data using a single camera's frame sequence. If the interpolation information contains only a marker, the decoder will immediately do the same as above without having to indirectly reach the conclusion that interpolation is needed.
- the interpolation information may also contain further items that either direct or assist a decoder in performing a desired interpolation.
- the interpolation information may also contain interpolation instructions, frame reference identifiers (that identify a frame or frames from which a decoder can base its interpolation), interpolation parameters (weighting factors, interpolation approaches to be used, regional/area definitions, etc.), filters (to be applied in the interpolation process) and any accompanying data (e.g., texture maps, etc.) that may enhance the interpolation process.
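- One way to model the interpolation information enumerated above is a single record carried in place of the frame data; every field name and type here is an illustrative assumption rather than the patent's actual syntax.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class InterpolationInfo:
    """Hypothetical container for the items listed above."""
    instructions: Optional[str] = None   # e.g. "blend", "stitch-and-crop"
    frame_refs: tuple = ()               # identifiers of frames to interpolate from
    weights: tuple = ()                  # weighting factors, one per reference
    regions: tuple = ()                  # regional/area definitions
    filters: tuple = ()                  # filters to apply during interpolation
    extra_data: Optional[bytes] = None   # accompanying data, e.g. texture maps
```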
- the encoder may choose to send interpolation information that identifies, for use in the interpolation process, one or more frames selected from other cameras' frame sequences and other, possibly non-adjacent, frames from within the same camera's frame sequence.
- the interpolation information may also include the various interpolation parameters mentioned above, interpolation approaches to be used, regional definitions in which such approaches and frames are used, filters and data.
- a single encoder can perform all or any portion of the above in association with a full frame or sections thereof. For instance, a single frame can be broken down into regions and interpolation per region can be different from that of another region.
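- A sketch of the per-region idea: the frame is tiled, and each tile may use its own approach (repetition for stationary areas, a blend elsewhere). The tiling scheme and the mode table are assumptions.

```python
import numpy as np

def interpolate_regions(prev, nxt, region_modes, tile=64):
    """`region_modes` maps (tile_row, tile_col) to either "repeat" or a blend
    weight for the preceding frame; unlisted tiles get a 50/50 blend."""
    out = np.empty_like(prev)
    h, w = prev.shape[:2]
    for r in range(0, h, tile):
        for c in range(0, w, tile):
            mode = region_modes.get((r // tile, c // tile), 0.5)
            if mode == "repeat":
                out[r:r+tile, c:c+tile] = prev[r:r+tile, c:c+tile]
            else:
                p = prev[r:r+tile, c:c+tile].astype(np.float32)
                n = nxt[r:r+tile, c:c+tile].astype(np.float32)
                out[r:r+tile, c:c+tile] = (mode * p + (1 - mode) * n).astype(prev.dtype)
    return out
```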
- FIGS. 4-7 are flow charts that illustrate several of many approaches for carrying out at least a portion of such encoder interpolation processing. More specifically, as shown in FIG. 4 , flowchart 400 begins with step 402 . In step 402 , both a first portion of three-dimensional video content and a second portion of the three-dimensional video content are received. The first portion corresponds to data that represents at least one frame from a first sequence of frames that provide a first perspective view. The second portion corresponds to data that represents at least one frame from a second sequence of frames that provide a second perspective view. Although not shown, a third portion that corresponds to data that represents at least one other frame from either the first or the second sequences of frames could also be gathered and considered in the interpolation process. Of course, many other portions from various other frames can also be gathered and used.
- the processing circuitry 304 receives all portions, including both the first portion and the second portion of the three-dimensional video content through the input circuitry 302 .
- the first portion and the second portion are encoded.
- the encoding involves at least in part analyzing the first portion and the second portion to identify an interpolation opportunity.
- the processing circuitry 304 encodes the first portion and the second portion.
- frame data is replaced with an interpolation marker.
- the processing circuitry 304 replaces the frame data with the interpolation marker.
- an encoded representation of the three-dimensional video content is delivered.
- the processing circuitry 304 delivers the encoded representation of the three-dimensional video content (e.g., encoded three-dimensional video content 312 ) through the output circuitry 306 .
- one or more of the steps 402 , 404 , 406 , and/or 408 of the flowchart 400 may not be performed. Moreover, other steps in addition to or in lieu of the steps 402 , 404 , 406 , and/or 408 may be performed.
- FIG. 5 shows a flowchart 500 that illustrates one of many possible implementations of the step 404 of the flowchart 400 in FIG. 4 in accordance with an embodiment of the present invention.
- flowchart 500 includes step 502 that may be applied in the step 404 of the flowchart 400 in FIG. 4 , for example.
- a current frame is compared with frames that neighbor the current frame to identify the interpolation opportunity.
- the frames that neighbor the current frame may be included in respective portions of the three-dimensional video content that correspond to respective reference perspective views.
- the current frame may be included in a portion of the three-dimensional video content that corresponds to a perspective view that is between the reference perspective views.
- the interpolation opportunity is identified in a first frame of the first portion while the neighboring frames include a second frame from the second portion.
- the processing circuitry 304 may compare the current frame with the frames that neighbor the current frame (neighbors within either or both of the current camera view frame sequence and other camera view's frame sequences) to identify the interpolation opportunity.
- step 404 of flowchart 400 may be performed in response to any one or more of the steps shown in flowcharts 600 , 700 , 800 , and/or 900 shown in FIGS. 6-9 .
- flowchart 600 includes step 602 .
- in step 602 , a determination is made that an accuracy of an estimate of the frame data is greater than a threshold accuracy.
- the processing circuitry 304 determines that the accuracy of the estimate is greater than the threshold accuracy.
- the processing circuitry 304 may perform an interpolation operation with respect to the first portion and/or the second portion to generate the estimate of the frame data.
- processing circuitry 304 may compare the estimate to the frame data to determine the accuracy of the estimate.
- processing circuitry 304 may compare the accuracy to the threshold accuracy to determine whether the accuracy of the estimate exceeds the threshold accuracy. For instance, processing circuitry may be configured to replace the frame data with an interpolation marker at step 406 if the accuracy of the estimate is greater than the threshold accuracy, but not if the accuracy is less than the threshold accuracy.
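- A sketch of steps 602 and 406 taken together: score the interpolated estimate against the real frame data and replace only when the score clears the threshold. PSNR as the accuracy measure and the 38 dB threshold are assumptions; the patent does not name a metric.

```python
import numpy as np

def psnr(reference: np.ndarray, estimate: np.ndarray, peak: float = 255.0) -> float:
    """Peak signal-to-noise ratio as a stand-in accuracy measure."""
    mse = np.mean((reference.astype(np.float64) - estimate.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak * peak / mse)

def should_replace_with_marker(frame, estimate, threshold_db=38.0) -> bool:
    """True when the estimate is accurate enough that the real frame data can
    safely be replaced with an interpolation marker."""
    return psnr(frame, estimate) > threshold_db
```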
- a determination is made that a communication channel via which the three-dimensional video content is to be transmitted has at least one specified characteristic. For instance, a load that is associated with the communication channel may be greater than a threshold load.
- FIG. 10 depicts a block diagram of a decoding system 1000 , which is an exemplary implementation of interpolation-enabled decoding system 214 of FIG. 2 , in accordance with an embodiment.
- decoding system 1000 includes input circuitry 1002 , processing circuitry 1004 , and output circuitry 1006 .
- Input circuitry 1002 serves as an input interface for decoding system 1000 .
- Processing circuitry 1004 receives a plurality of encoded portions 1010 A- 1010 N of encoded three-dimensional video content 1008 through input circuitry 1002 .
- Each of the encoded portions 1010 A- 1010 N represents a respective sequence of frames that provides a respective perspective view of a video event.
- Processing circuitry 1004 decodes the encoded portions 1010 A- 1010 N to provide decoded portions 1014 A- 1014 M, which are included in decoded three-dimensional video content 1012 .
- the decoded three-dimensional video content 1012 is also referred to as a decoded representation of the encoded three-dimensional video content 1008 . It will be recognized that the number of encoded portions "N" need not necessarily be equal to the number of decoded portions "M". For instance, processing circuitry 1004 may interpolate between any of the encoded portions 1010 A- 1010 N to generate one or more of the decoded portions 1014 A- 1014 M.
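- A sketch of one way M can exceed N: synthesize an intermediate view between each adjacent pair of decoded views, so N decoded portions yield 2N-1 output portions. The simple midpoint blend is an assumption.

```python
import numpy as np

def expand_views(views):
    """views: list of N per-view frames -> list of 2N-1 frames, with a
    blended intermediate view inserted between each adjacent pair."""
    out = [views[0]]
    for left, right in zip(views, views[1:]):
        mid = 0.5 * left.astype(np.float32) + 0.5 * right.astype(np.float32)
        out.append(mid.astype(left.dtype))
        out.append(right)
    return out
```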
- processing circuitry 1004 responds to one or more interpolation markers by generating frame data to replace the respective interpolation marker(s). For instance, processing circuitry 1004 may respond to a first interpolation marker by generating first frame data to replace the first interpolation marker. Processing circuitry may respond to a second interpolation marker by generating second frame data to replace the second interpolation marker, and so on.
- the interpolation marker(s) are included in the encoded three-dimensional video content 1008 .
- the instance(s) of frame data that replace the respective interpolation marker(s) are included in the decoded three-dimensional video content 1012 .
- any one or more of the interpolation marker(s) may be accompanied by an interpolation instruction.
- processing circuitry 1004 may use a first subset of the encoded portions 1010 A- 1010 N that is specified by a first interpolation instruction that corresponds to a first interpolation marker to generate first frame data to replace the first interpolation marker.
- similarly, processing circuitry 1004 may use a second subset of the encoded portions 1010 A- 1010 N that is specified by a second interpolation instruction that corresponds to a second interpolation marker to generate second frame data to replace the second interpolation marker, and so on.
- Each interpolation instruction (or the interpolation marker that it accompanies) may specify a type of interpolation to be performed to generate the frame data that the interpolation marker replaces.
- processing circuitry 1004 identifies one or more frames that are not directly represented by one or more respective encoded portions of the encoded three-dimensional video content 1008 .
- a frame is not directly represented if the frame is replaced with an interpolation marker in the encoded three-dimensional video content 1008 .
- a frame is not directly represented if the frame is missing from the encoded three-dimensional video content 1008 .
- a frame is not directly represented if the frame is represented by erroneous data in the encoded three-dimensional video content. Missing frames and erroneous frame data may occur, for example, because of (i) defects in storage media or storage process, and (ii) losses or unacceptable delays encountered in a less than perfect communication pathway.
- Another example resulting in a need for interpolation occurs when referenced frame data cannot be found or is itself erroneous (corrupted). That is, the current frame data is correct, but decoding it requires one or more other portions of frame data (portions directly associated with different frames) that happen to be missing or contain erroneous data. In such a case, without an ability to decode the present, correct frame data, interpolation may be performed to generate the current frame as an alternative.
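- A sketch of that fallback follows; `decode_frame` and the exceptions it raises when a reference is unavailable are placeholders for a real codec's API, not a known interface.

```python
import numpy as np

def decode_or_interpolate(current, references, prev_out, next_out, decode_frame):
    """Decode the current frame normally; if its reference data is missing or
    corrupt, fall back to a temporal blend of neighboring decoded frames."""
    try:
        return decode_frame(current, references)   # normal decode path
    except (KeyError, ValueError):                 # assumed "reference unavailable" errors
        blend = (0.5 * prev_out.astype(np.float32)
                 + 0.5 * next_out.astype(np.float32))
        return blend.astype(prev_out.dtype)
```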
- Processing circuitry 1004 produces interpolation(s) of the respective frame(s) that are not directly represented.
- Output circuitry 1006 serves as an output interface for decoding system 1000 .
- Processing circuitry 1004 delivers the decoded three-dimensional video content 1012 through output circuitry 1006 .
- FIGS. 11-16 show flowcharts 1100 , 1200 , 1300 , 1400 , 1500 , and 1600 of exemplary methods for decoding portions of encoded three-dimensional video content using interpolation according to embodiments.
- Flowcharts 1100 , 1200 , 1300 , 1400 , 1500 , and 1600 may be performed by decoding system 1000 shown in FIG. 10 , for example.
- the methods of flowcharts 1100 , 1200 , 1300 , 1400 , 1500 , and 1600 are not limited to that embodiment.
- the basic approach involves decoder processing of at least a first sequence of frames and a second sequence of frames, wherein the first sequence represents a first perspective view (e.g., a right eye view) while the second sequence represents a second perspective view (e.g., a left eye view).
- the decoder receives many frames that are encoded based on the frame itself (no referencing to other frames), internal referencing (referencing frames within the same sequence of frames), and external referencing (referencing frames outside of the current frame's sequence of frames).
- before reaching the decoder, some of such encoded data may have been either (i) deleted or (ii) replaced with interpolation information.
- Interpolation information may be nothing more than an indicator or marker (an “interpolation marker”) but may also contain interpolation instructions, data and parameters.
- the decoder processing involves applying at least one interpolation approach but, depending on the embodiment, may apply multiple approaches (along with variations of the underlying parameters).
- an encoder may replace the middle frame in the sequence with nothing at all.
- the decoder detects that the middle frame is missing and interpolates between the first frame and the third frame.
- the encoder may use a marker (an interpolation marker) instead of the second frame's encoded data.
- the decoder might either (i) substitute the first or the third frame data for the missing second frame data, which is likely to not be noticed by a viewer due to the relatively short frame rate period, (ii) create a substitute for the missing second frame data by creating an average between the first frame and the third frame (e.g., a 50/50 “weighted” addition), or (iii) otherwise create a substitute based on a weighted addition percentage or using some other interpolation approach that may utilize interpolation parameters, filters and other data.
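- For illustration only (this sketch is not part of the original disclosure), options (i) through (iii) above might be realized as follows; the function name, array shapes, and the weight parameter are hypothetical:

```python
import numpy as np

def substitute_missing_frame(prev_frame, next_frame, mode="average", weight=0.5):
    """Produce a stand-in for the missing second frame from its neighbors.

    prev_frame, next_frame: uint8 arrays of shape (height, width, 3).
    mode "repeat"   -> reuse the preceding frame unchanged (option i).
    mode "average"  -> 50/50 weighted addition of the neighbors (option ii).
    mode "weighted" -> arbitrary weighted addition: weight applied to
                       prev_frame, (1 - weight) to next_frame (option iii).
    """
    if mode == "repeat":
        return prev_frame.copy()
    if mode == "average":
        weight = 0.5
    blended = weight * prev_frame.astype(np.float32) \
        + (1.0 - weight) * next_frame.astype(np.float32)
    return np.clip(blended, 0, 255).astype(np.uint8)
```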
- the decoder may stitch the frames preceding and following the missing middle frame together and then crop the stitched result to produce a substitute for the missing middle frame data.
- in a panning scene, some objects, such as a moving car, may appear stationary at least in areas within the field of view, so multiple interpolation approaches may be applied within a single frame.
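- The stitch-and-crop idea can be sketched under the simplifying assumption of a pure rightward pan with a known pixel shift; the shift estimation itself (e.g., by block matching) is omitted, and all names here are illustrative rather than part of the disclosure:

```python
import numpy as np

def interpolate_panned_frame(prev_frame, next_frame, shift_px):
    """Approximate a missing middle frame during a horizontal pan.

    Assumes a pure rightward pan of shift_px pixels (0 <= shift_px <= width)
    between the two neighboring frames. The neighbors are stitched along the
    pan direction, then a frame-sized window is cropped halfway through it.
    """
    w = prev_frame.shape[1]
    # Stitch: prev_frame plus the strip newly revealed in next_frame.
    stitched = np.concatenate([prev_frame, next_frame[:, w - shift_px:]], axis=1)
    # Crop a full-width frame starting halfway through the pan.
    start = shift_px // 2
    return stitched[:, start:start + w]
```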
- interpolation with reference to other camera view frame sequences may also be performed.
- an object moving in a frame of a first camera's frame sequence might have strong correlation with the same object a short time later captured in a frame of the second camera's frame sequence.
- if the correlating frame of the second camera's frame sequence is discarded or replaced, at least the frame in the first camera's frame sequence can be used by the decoder to recreate the missing data.
- a single frame (or frame portion) alone or along with other frames (or frame portions) from either or both camera sequences can be used by the decoder to recreate the substitute.
- the decoder will conclude that interpolation is needed and respond by either repeating an adjacent frame (e.g., if the preceding and subsequent frames are substantially different) or creating a middling alternative based on both preceding and subsequent frame data using a single camera's frame sequence. If the interpolation information contains only a marker, the decoder will immediately do the same as above without having to indirectly reach the conclusion that interpolation is needed.
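- Expressed as decoder control flow, the behavior just described might look like the following sketch; the sentinel object, the similarity threshold, and the helper names are hypothetical stand-ins for whatever routines a given decoder implements:

```python
import numpy as np

INTERPOLATION_MARKER = object()  # hypothetical sentinel standing in for a real marker

def frames_differ(a, b, threshold=30.0):
    """Crude similarity test: mean absolute pixel difference above a threshold."""
    return np.mean(np.abs(a.astype(np.float32) - b.astype(np.float32))) > threshold

def recover_frame(encoded, prev_frame, next_frame):
    """Recreate a frame whose encoded data was deleted or replaced by a marker.

    encoded is None                  -> data deleted; decoder infers interpolation.
    encoded is INTERPOLATION_MARKER  -> marker found; interpolate immediately.
    otherwise                        -> 'encoded' is treated as ordinary decodable
                                        data and returned unchanged in this sketch.
    """
    if encoded is None or encoded is INTERPOLATION_MARKER:
        if frames_differ(prev_frame, next_frame):
            return prev_frame.copy()  # repeat an adjacent frame
        mid = 0.5 * (prev_frame.astype(np.float32) + next_frame.astype(np.float32))
        return mid.astype(np.uint8)   # middling alternative from both neighbors
    return encoded                    # normal decoding path elided
```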
- the interpolation information may also contain further items that either direct or assist the decoder in performing a desired interpolation.
- the interpolation information may also contain interpolation instructions, frame reference identifiers (that identify a frame or frames from which the decoder can base its interpolation), interpolation parameters (weighting factors, interpolation approaches to be used, regional/area definitions, etc.), filters (to be applied in the interpolation process) and any accompanying data (e.g., texture maps, etc.) that may enhance the interpolation process.
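- The items above suggest a simple container for interpolation information. The following dataclass is purely illustrative; the field names are hypothetical and the disclosure does not prescribe any particular syntax or encoding:

```python
from dataclasses import dataclass, field
from typing import Any, Dict, List, Optional

@dataclass
class InterpolationInfo:
    """Hypothetical container for the interpolation information items above."""
    instructions: Optional[str] = None                     # e.g., "blend", "stitch_crop"
    frame_refs: List[int] = field(default_factory=list)    # frames to interpolate from
    weights: List[float] = field(default_factory=list)     # weighting factors
    regions: List[Any] = field(default_factory=list)       # regional/area definitions
    filters: List[Any] = field(default_factory=list)       # filters to apply
    extra: Dict[str, Any] = field(default_factory=dict)    # e.g., texture maps

# A bare interpolation marker corresponds to InterpolationInfo() with every
# field left empty.
```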
- an encoder may choose to send interpolation information that identifies, for use by the decoder, one or more frames selected from other cameras' frame sequences and other, possibly non-adjacent, frames from within the same camera's frame sequence.
- the interpolation information may also include the various interpolation parameters mentioned above, interpolation approaches to be used, regional definitions in which such approaches and frames are used, filters and data.
- a decoder can perform all or any portion of the above in association with a full frame or sections thereof. For instance, the decoder can perform interpolation operations on respective regions of a single frame, and interpolation per region can be different from that of another region.
- FIGS. 11-16 are flowcharts that illustrate several of many approaches for carrying out at least a portion of such decoder interpolation processing. More specifically, as shown in FIG. 11 , flowchart 1100 begins at step 1102 . In step 1102 , both a first encoded portion of a first encoded sequence of frames that represents a first perspective view and a second encoded portion of a second encoded sequence of frames that represents a second perspective view are received. In the implementation example of FIG. 10 , the processing circuitry 1004 receives the first encoded portion and the second encoded portion through the input circuitry 1002 .
- the first encoded portion and the second encoded portion are decoded.
- the decoding involves responding to an interpolation marker by generating frame data to replace the interpolation marker.
- the processing circuitry 1004 decodes the first encoded portion and the second encoded portion.
- a decoded representation of the encoded three-dimensional video content is delivered.
- the processing circuitry 1004 delivers the decoded representation of the encoded three-dimensional video content (e.g., decoded three-dimensional video content 1012 ) through the output circuitry 1006 .
- one or more steps 1102 , 1104 , and/or 1106 of flowchart 1100 may not be performed. Moreover, steps in addition to or in lieu of steps 1102 , 1104 , and/or 1106 may be performed.
- flowchart 1200 begins at step 1202 .
- a determination is made that a number of perspective views that a display is capable of processing is greater than a number of perspective views that is initially represented by the encoded three-dimensional video content.
- the processing circuitry 1004 determines that the number of perspective views that the display is capable of processing is greater than the number of perspective views that is initially represented by the encoded three-dimensional video content.
- an interpolation request is provided to an encoder.
- the interpolation request requests inclusion of an interpolation marker in the encoded three-dimensional video content.
- the processing circuitry 1004 provides the interpolation request through the output circuitry 1006 .
- an interpolation is performed between a decoded version of the first encoded portion and a decoded version of the second encoded portion to generate frame data that corresponds to a third sequence of frames that represent a third perspective view to replace the interpolation marker.
- the third perspective view is not initially represented by the encoded three-dimensional video content.
- the processing circuitry 1004 interpolates between the decoded version of the first encoded portion and the decoded version of the second encoded portion to generate the frame data.
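- As a rough illustration of this step, an intermediate perspective view can be synthesized by blending co-timed frames of the two decoded views. A practical decoder would likely use disparity-compensated view synthesis rather than a plain blend; the sketch below, with hypothetical names, shows only the data flow:

```python
import numpy as np

def synthesize_intermediate_view(first_view, second_view, position=0.5):
    """Generate frame data for a third perspective view between two decoded views.

    first_view, second_view: lists of co-timed (H, W, 3) uint8 frames from the
    decoded first and second encoded portions.
    position: where the new view lies between them (0.0 = first, 1.0 = second).
    """
    third_view = []
    for f, s in zip(first_view, second_view):
        blended = (1.0 - position) * f.astype(np.float32) \
            + position * s.astype(np.float32)
        third_view.append(np.clip(blended, 0, 255).astype(np.uint8))
    return third_view
```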
- At step 1302 , an interpolation instruction is received from an upstream device.
- the processing circuitry 1004 receives the interpolation instruction through the input circuitry 1002 .
- the first encoded portion and the second encoded portion are decoded.
- the decoding involves responding to an interpolation marker by generating frame data to replace the interpolation marker in accordance with the interpolation instruction.
- the processing circuitry 1004 decodes the first encoded portion and the second encoded portion.
- At step 1402 , the first encoded portion is decoded to provide a first decoded portion of a first decoded sequence of frames that represents the first perspective view.
- the processing circuitry 1004 decodes the first encoded portion.
- the second encoded portion is decoded to provide decoded data that represents the second perspective view, the decoded data including an interpolation marker.
- the processing circuitry 1004 decodes the second encoded portion.
- an interpolation is performed between the first decoded portion and a third decoded portion of a third decoded sequence of frames that represents a third perspective view to generate frame data to replace the interpolation marker in the decoded data.
- the processing circuitry 1004 interpolates between the first decoded portion and the third decoded portion to generate the frame data.
- flowchart 1500 begins at step 1502 .
- a weight indicator is received from an upstream device.
- the weight indicator specifies an extent to which the first decoded portion is to be weighted with respect to the third decoded portion.
- the processing circuitry 1004 receives the weight indicator from the upstream device through input circuitry 1002 .
- an interpolation is performed between the first decoded portion and a third decoded portion of a third decoded sequence of frames that represents a third perspective view to generate frame data to replace the interpolation marker in the decoded data based on the extent that is specified by the weight indicator.
- the processing circuitry 1004 interpolates between the first decoded portion and the third decoded portion to generate the frame data.
- flowchart 1600 begins at step 1602 .
- At step 1602 , at least a portion of first encoded data is retrieved that relates to a first sequence of frames representing a first perspective view.
- the processing circuitry 1004 retrieves at least the portion of the first encoded data.
- At step 1604 , at least a portion of second encoded data is retrieved that relates to a second sequence of frames representing a second perspective view.
- the processing circuitry 1004 retrieves at least the portion of the second encoded data.
- a first frame is identified within the first sequence of frames that is not directly represented by the first encoded data retrieved. For example, an interpolation marker that is associated with the first frame may be identified. In accordance with this example, the interpolation marker may be accompanied by interpolation instructions. In another example, the first frame includes a missing frame. In the implementation example of FIG. 10 , the processing circuitry 1004 identifies the first frame.
- an interpolation of the first frame is produced.
- the interpolation may be based at least in part on the second encoded data.
- production of the interpolation of the first frame may be based on at least the portion of the first encoded data and at least a portion of third encoded data that relates to a third sequence of frames representing a third perspective view, in accordance with a weight indicator.
- the weight indicator specifies an extent to which at least the portion of the first encoded data is to be weighted with respect to at least the portion of the third encoded data.
- the processing circuitry 1004 produces the interpolation of the first frame.
- FIGS. 17-20 illustrate exemplary interpolation techniques 1700 , 1800 , 1900 , and 2000 according to embodiments.
- Each of the interpolation techniques 1700 , 1800 , 1900 , and 2000 is described with reference to exemplary instances of video content.
- the instances of video content may be 2D video content, 3D2 video content, 3D4 video content, 3D8 video content, etc. It will be recognized that 2D video content represents one perspective view of a video event, 3D2 video content represents two perspective views of the video event, 3D4 video content represents four perspective views of the video event, 3D8 video content represents eight perspective views of the video event, and so on.
- technique 1700 is directed to staging up from 2D content to 3D8 content according to an embodiment.
- Technique 1700 will be described with reference to original 3D8 content 1702 , 2D content 1704 , 3D2 content 1706 , 3D4 content 1708 , and 3D8 content 1710 .
- the original content used to illustrate technique 1700 is 3D8 content, which includes eight video streams (labeled as 1-8) that represent respective views of a video event.
- a single stream of the original 3D8 content 1702 may be used to provide 2D content.
- stream 3 of the original 3D8 content 1702 is used to provide 2D content 1704 for illustrative purposes.
- Internal interframe compression referencing is used with respect to stream 3 of the original 3D8 content 1702 to generate stream 3 of the 2D content 1704 .
- no other streams that are included in the original 3D8 content 1702 are referenced to generate stream 3 of the 2D content 1704 .
- Internal interframe compression referencing is a technique in which differences between frames (e.g., adjacent frames) that are included in a stream of video content are used to represent the frames in that stream.
- a first frame in the stream may be designated as a reference frame with other frames in the stream being designated as dependent frames.
- the reference frame may be represented by data that is sufficient to independently define the reference frame
- the dependent frames may be represented by difference data.
- the difference data that represents each dependent frame may use data that represents one or more of the other frames in the stream to generate data that is sufficient to independently define that dependent frame.
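- In miniature, internal interframe compression referencing can be pictured as storing one reference frame whole and every dependent frame as a difference. This toy sketch (not part of the disclosure) ignores motion compensation and entropy coding, which a real codec would include:

```python
import numpy as np

def encode_stream(frames):
    """Toy internal interframe referencing: one reference frame plus differences.

    The first frame is kept whole as the reference frame; every dependent
    frame is represented only by its difference from the preceding frame.
    """
    reference = frames[0].astype(np.int16)
    diffs = [frames[i].astype(np.int16) - frames[i - 1].astype(np.int16)
             for i in range(1, len(frames))]
    return reference, diffs

def decode_stream(reference, diffs):
    """Rebuild each dependent frame by accumulating its difference data."""
    frames = [reference]
    for d in diffs:
        frames.append(frames[-1] + d)
    return [np.clip(f, 0, 255).astype(np.uint8) for f in frames]
```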
- Two streams of the original 3D8 content 1702 may be used to provide 3D2 content.
- streams 3 and 7 of the original 3D8 content 1702 are used to provide 3D2 content 1706 for illustrative purposes.
- Internal interframe compression referencing is used with respect to stream 3 of the original 3D8 content 1702 to generate stream 3 of the 3D2 content 1706 , as described above with reference to 2D content 1704 .
- Stream 7 of the 3D2 content 1706 is generated using internal interframe compression referencing with respect to stream 7 of the original 3D8 content 1702 and stream 3 of the 2D content 1704 for referencing.
- streams of the original 3D8 content 1702 may be used to provide 3D4 content.
- streams 3 and 7 of the 3D4 content 1708 are generated as described above with reference to 3D2 content 1706 .
- Streams 1 and 5 of the 3D4 content 1708 are generated using streams 1 and 5 of the original 3D8 content 1702 and streams 3 and 7 of the 3D2 content 1706 for referencing.
- All eight streams of the original 3D8 content 1702 may be used to provide 3D8 content.
- streams 1 , 3 , 5 , and 7 of the 3D8 content 1710 are generated as described above with reference to 3D4 content 1708 .
- Streams 2 , 4 , 6 , and 8 of the 3D8 content 1710 are generated using any of a plurality of streams, which includes streams 1 , 3 , 5 , and 7 of the 3D4 content 1708 and streams 2 , 4 , 6 , and 8 of the original 3D8 content 1702 , for referencing.
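- The layered referencing of technique 1700 can be summarized as a dependency table. The mapping below is merely one hypothetical way to render the structure described above as data; it is not an encoding format:

```python
# Streams available for referencing at each layer of technique 1700.
# "self" denotes internal interframe referencing within the stream itself;
# numbers denote other streams whose frames may be referenced.
REFERENCING_1700 = {
    3: ["self"],              # 2D layer: stream 3 stands alone
    7: ["self", 3],           # 3D2 layer adds stream 7
    1: ["self", 3, 7],        # 3D4 layer adds streams 1 and 5
    5: ["self", 3, 7],
    2: ["self", 1, 3, 5, 7],  # 3D8 layer adds streams 2, 4, 6, and 8
    4: ["self", 1, 3, 5, 7],
    6: ["self", 1, 3, 5, 7],
    8: ["self", 1, 3, 5, 7],
}
```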
- technique 1800 is directed to a limited referencing configuration according to an embodiment.
- Technique 1800 will be described with reference to original 3D8 content 1802 , 2D content 1804 , 3D2 content 1806 , 3D4 content 1808 , and 3D8 content 1810 .
- the original content used to illustrate technique 1800 is 3D8 content, which includes eight video streams (labeled as 1-8) that represent respective views of a video event.
- the 2D content 1804 and the 3D2 content 1806 are generated in the same manner as the 2D content 1704 and the 3D2 content 1706 described above with reference to FIG. 17 .
- the manner in which the 3D4 content 1808 and the 3D8 content 1810 are generated differs from the manner in which the 3D4 content 1708 and the 3D8 content 1710 are generated.
- streams 3 and 7 of the 3D4 content 1808 are generated as described above with reference to 3D4 content 1708 of FIG. 17 .
- stream 1 of the 3D4 content 1808 is generated using stream 1 of the original 3D8 content 1802 and streams 3 and 7 of the 3D2 content 1806 for referencing.
- stream 5 of the 3D4 content 1808 is generated using stream 5 of the original 3D8 content 1802 and streams 3 and 7 of the 3D2 content 1806 for referencing.
- Streams 1 , 3 , 5 , and 7 of the 3D8 content 1810 are generated as described above with reference to the 3D4 content 1808 .
- Streams 2 , 4 , 6 , and 8 of the 3D8 content 1810 are generated using internal interframe compression referencing and streams 1 , 3 , 5 , and 7 of the 3D4 content 1808 for referencing.
- technique 1900 is directed to interpolation of lost frame data to maintain image stability according to an embodiment.
- Technique 1900 will be described with reference to original 3D8 content 1902 , 2D content 1904 , and 3D2 content 1906 .
- the original content used to illustrate technique 1900 is 3D8 content, which includes eight video streams (labeled as 1-8) that represent respective views of a video event.
- the 2D content 1904 is generated in the same manner as the 2D content 1704 described above with reference to FIG. 17 .
- the manner in which the 3D2 content 1906 is generated differs from the manner in which the 3D2 content 1706 is generated.
- stream 3 of the 3D2 content 1906 is generated as described above with reference to 3D2 content 1706 of FIG. 17 .
- stream 7 of the 3D2 content 1906 is generated using stream 3 of the 2D content 1904 for referencing.
- Stream 7 of the 3D2 content 1906 is generated further using internal interframe compression referencing if a previous frame and/or a future frame of stream 3 of the 2D content 1904 is similar to a current frame of stream 3 of the 2D content 1904 .
- technique 2000 is directed to interpolation to provide a number of views that is greater than a number of views that are represented by received video content according to an embodiment.
- Technique 2000 will be described with reference to original 3D4 content 2002 , 2D content 2004 , 3D2 content 2006 , 3D4 content 2008 , and 3D8 content 2010 .
- the original content used to illustrate technique 2000 is 3D4 content, which includes four video streams (labeled as 1-4) that represent respective views of a video event.
- a single stream of the original 3D4 content 2002 may be used to provide 2D content. As shown in FIG. 20 , stream 3 of the original 3D4 content 2002 is used to provide 2D content 2004 for illustrative purposes. Internal interframe compression referencing is used with respect to stream 3 of the original 3D4 content 2002 to generate stream 3 of the 2D content 2004 . However, no other streams that are included in the original 3D4 content 2002 are referenced to generate stream 3 of the 2D content 2004 .
- Two streams of the original 3D4 content 2002 may be used to provide 3D2 content.
- streams 1 and 3 of the original 3D4 content 2002 are used to provide 3D2 content 2006 for illustrative purposes.
- Stream 3 of the 3D2 content 2006 is generated as described above with reference to the 2D content 2004 .
- Stream 1 of the 3D2 content 2006 is generated using internal interframe compression referencing with respect to stream 1 of the original 3D4 content 2002 and stream 3 of the 2D content 2004 for referencing.
- All four streams of the original 3D4 content 2002 may be used to provide 3D4 content.
- streams 1 and 3 of the 3D4 content 2008 are generated as described above with reference to 3D2 content 2006 .
- Streams 2 and 4 of the 3D4 content 2008 are generated using streams 2 and 4 of the original 3D4 content 2002 and streams 1 and 3 of the 3D2 content 2006 for referencing.
- All four streams of the original 3D4 content 2002 may be used to provide 3D8 content.
- streams 1 - 4 of the 3D8 content 2010 are generated as described above with reference to 3D4 content 2008 .
- Streams 5 - 8 of the 3D8 content 2010 are entirely interpolated using nearest neighbor streams.
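- A wholly interpolated stream can be pictured as a frame-by-frame blend of its nearest received neighbors. In the sketch below the neighbor pairing is hypothetical, since it depends on the camera geometry of FIG. 20, which is not reproduced in this text:

```python
import numpy as np

def interpolate_extra_views(views, neighbor_pairs):
    """Create wholly interpolated views from their nearest received neighbors.

    views: dict mapping stream number -> list of (H, W, 3) uint8 frames.
    neighbor_pairs: dict mapping each synthesized stream number to the pair
    of received streams it lies between (a purely hypothetical assignment).
    """
    for new_stream, (a, b) in neighbor_pairs.items():
        views[new_stream] = [
            ((fa.astype(np.float32) + fb.astype(np.float32)) / 2.0).astype(np.uint8)
            for fa, fb in zip(views[a], views[b])
        ]
    return views

# e.g., synthesizing some of streams 5-8 midway between received streams 1-4:
# views = interpolate_extra_views(views, {5: (1, 2), 6: (2, 3), 7: (3, 4)})
```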
- Streams that are used by a decoder for purposes of interpolation must be available to the decoder.
- when external interpolation referencing is used to encode data, only frame sequences (perspective views) that are allowed to be referenced for encoding purposes may be used for interpolation purposes.
- External interpolation referencing involves referencing frames to be used for interpolation that can be found in frame sequences outside of a current frame sequence (i.e., from a different perspective view).
- streams are encoded using hierarchical encoding techniques, as described in commonly-owned, co-pending U.S. patent application Ser. No. ______ (Atty. Docket No. A05.01330000), filed on even date herewith and entitled “Hierarchical Video Compression Supporting Selective Delivery of Two-Dimensional and Three-Dimensional Video Content,” the entirety of which is incorporated by reference herein.
- Such embodiments enable a subset of a total number of streams to be received and decoded at the decoder, wherein some of the streams received may be decoded by referencing some of the other streams received.
- none of the received streams rely on non-received streams for decoding. Accordingly, in these embodiments, interpolation referencing is limited to received streams.
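- The availability constraint can be enforced with a simple check before interpolation begins; this fragment is illustrative only and its names are hypothetical:

```python
def usable_interpolation_refs(received_streams, interpolation_refs):
    """Restrict interpolation referencing to streams the decoder received.

    received_streams: identifiers of the streams actually delivered.
    interpolation_refs: stream identifiers named by interpolation information.
    Raises if interpolation would depend on a stream that was never received.
    """
    missing = set(interpolation_refs) - set(received_streams)
    if missing:
        raise ValueError("interpolation references unavailable streams: %s" % missing)
    return list(interpolation_refs)
```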
- Embodiments may be implemented in hardware, software, firmware, or any combination thereof.
- encoding system 208 , decoding system 214 , display circuitry 216 , input circuitry 302 , processing circuitry 304 , output circuitry 306 , input circuitry 1002 , processing circuitry 1004 , and/or output circuitry 1006 may be implemented as hardware logic/electrical circuitry.
- encoding system 208 , decoding system 214 , display circuitry 216 , input circuitry 302 , processing circuitry 304 , output circuitry 306 , input circuitry 1002 , processing circuitry 1004 , and/or output circuitry 1006 may be implemented as computer program code configured to be executed in one or more processors.
- FIG. 21 shows a block diagram of an exemplary implementation of electronic device 2100 according to an embodiment.
- electronic device 2100 may include one or more of the elements shown in FIG. 21 .
- electronic device 2100 may include one or more processors (also called central processing units, or CPUs), such as a processor 2104 .
- processors also called central processing units, or CPUs
- Processor 2104 is connected to a communication infrastructure 2102 , such as a communication bus.
- processor 2104 can simultaneously operate multiple computing threads.
- Electronic device 2100 also includes a primary or main memory 2106 , such as random access memory (RAM).
- Main memory 2106 has stored therein control logic 2128 A (computer software), and data.
- Electronic device 2100 also includes one or more secondary storage devices 2110 .
- Secondary storage devices 2110 include, for example, a hard disk drive 2112 and/or a removable storage device or drive 2114 , as well as other types of storage devices, such as memory cards and memory sticks.
- electronic device 2100 may include an industry standard interface, such as a universal serial bus (USB) interface for interfacing with devices such as a memory stick.
- Removable storage drive 2114 represents a floppy disk drive, a magnetic tape drive, a compact disk drive, an optical storage device, tape backup, etc.
- Removable storage drive 2114 interacts with a removable storage unit 2116 .
- Removable storage unit 2116 includes a computer useable or readable storage medium 2124 having stored therein computer software 2128 B (control logic) and/or data.
- Removable storage unit 2116 represents a floppy disk, magnetic tape, compact disk, DVD, optical storage disk, or any other computer data storage device.
- Removable storage drive 2114 reads from and/or writes to removable storage unit 2116 in a well known manner.
- Electronic device 2100 further includes a communication or network interface 2118 .
- Communication interface 2118 enables the electronic device 2100 to communicate with remote devices.
- communication interface 2118 allows electronic device 2100 to communicate over communication networks or mediums 2122 (representing a form of a computer useable or readable medium), such as LANs, WANs, the Internet, etc.
- Network interface 2118 may interface with remote sites or networks via wired or wireless connections.
- Control logic 2128 C may be transmitted to and from electronic device 2100 via the communication medium 2122 .
- Any apparatus or manufacture comprising a computer useable or readable medium having control logic (software) stored therein is referred to herein as a computer program product or program storage device.
- Devices in which embodiments may be implemented may include storage, such as storage drives, memory devices, and further types of computer-readable media.
- Examples of such computer-readable storage media include a hard disk, a removable magnetic disk, a removable optical disk, flash memory cards, digital video disks, random access memories (RAMs), read only memories (ROM), and the like.
- The terms “computer program medium” and “computer-readable medium” are used to generally refer to the hard disk associated with a hard disk drive, a removable magnetic disk, a removable optical disk (e.g., CDROMs, DVDs, etc.), zip disks, tapes, magnetic storage devices, MEMS (micro-electromechanical systems) storage, nanotechnology-based storage devices, as well as other media such as flash memory cards, digital video discs, RAM devices, ROM devices, and the like.
- Such computer-readable storage media may store program modules that include computer program logic for encoding system 208 , decoding system 214 , display circuitry 216 , input circuitry 302 , processing circuitry 304 , output circuitry 306 , input circuitry 1002 , processing circuitry 1004 , output circuitry 1006 , flowchart 400 , flowchart 500 , flowchart 600 , flowchart 700 , flowchart 800 , flowchart 900 , flowchart 1100 , flowchart 1200 , flowchart 1300 , flowchart 1400 , flowchart 1500 , flowchart 1600 (including any one or more steps of flowcharts 400 , 500 , 600 , 700 , 800 , 900 , 1100 , 1200 , 1300 , 1400 , 1500 , and 1600 ), and/or further embodiments of the present invention described herein.
- Embodiments of the invention are directed to computer program products comprising such logic (e.g., in the form of program code or software) stored on any computer useable medium.
- Such program code when executed in one or more processors, causes a device to operate as described herein.
- the invention can be put into practice using software, firmware, and/or hardware implementations other than those described herein. Any software, firmware, and hardware implementations suitable for performing the functions described herein can be used.
- electronic device 2100 may be implemented in association with a variety of types of display devices.
- electronic device 2100 may be one of a variety of types of media devices, such as a stand-alone display (e.g., a television display such as flat panel display, etc.), a computer, a game console, a set top box, a digital video recorder (DVR), other electronic device mentioned elsewhere herein, etc.
- Media content that is delivered in two-dimensional or three-dimensional form according to embodiments described herein may be stored locally or received from remote locations. For instance, such media content may be locally stored for playback (replay TV, DVR), may be stored in removable memory (e.g., a DVD or a memory stick), or may be received from a remote location over a communication medium.
- FIG. 21 shows a first media content 2130 A that is stored in hard disk drive 2112 , a second media content 2130 B that is stored in storage medium 2124 of removable storage unit 2116 , and a third media content 2130 C that may be remotely stored and received over communication medium 2122 by communication interface 2118 .
- Media content 2130 may be stored and/or received in these manners and/or in other ways.
Description
- This application claims the benefit of U.S. Provisional Application No. 61/291,818, filed Dec. 31, 2009, which is incorporated by reference herein in its entirety. This application also claims the benefit of U.S. Provisional Application No. 61/303,119, filed Feb. 10, 2010, which is incorporated by reference herein in its entirety.
- This application is also related to the following U.S. Patent Applications, each of which also claims the benefit of U.S. Provisional Patent Application Nos. 61/291,818 and 61/303,119 and each of which is incorporated by reference herein:
- U.S. patent application Ser. No. 12/845,409, filed on Jul. 28, 2010, and entitled “Display with Adaptable Parallax Barrier”;
- U.S. patent application Ser. No. 12/845,440, filed on Jul. 28, 2010, and entitled “Adaptable Parallax Barrier Supporting Mixed 2D and Stereoscopic 3D Display Regions”;
- U.S. patent application Ser. No. 12/845,461, filed on Jul. 28, 2010, and entitled “Display Supporting Multiple Simultaneous 3D Views”; and
- U.S. patent application Ser. No. ______ (Attorney Docket No. A05.01330000), filed on even date herewith and entitled “Hierarchical Video Compression Supporting Selective Delivery of Two-Dimensional and Three-Dimensional Video Content.”
- 1. Field of the Invention
- The present invention relates to techniques for processing video images.
- 2. Background Art
- Images may be transmitted for display in various forms. For instance, television (TV) is a widely used telecommunication medium for transmitting and displaying images in monochromatic (“black and white”) or color form. Conventionally, images are provided in analog form and are displayed by display devices in the form of two-dimensional images. More recently, images are being provided in digital form for display in two-dimensions on display devices having improved resolution. Even more recently, images capable of being displayed in three-dimensions are being provided.
- Conventional displays may use a variety of techniques to achieve three-dimensional image viewing functionality. For example, various types of glasses have been developed that may be worn by users to view three-dimensional images displayed by a conventional display. Examples of such glasses include glasses that utilize color filters or polarized filters. In each case, the lenses of the glasses pass two-dimensional images of differing perspective to the user's left and right eyes. The images are combined in the visual center of the brain of the user to be perceived as a three-dimensional image. In another example, synchronized left eye, right eye LCD (liquid crystal display) shutter glasses may be used with conventional two-dimensional displays to create a three-dimensional viewing illusion. In still another example, LCD display glasses may be used to display three-dimensional images to a user. The lenses of the LCD display glasses include corresponding displays that provide images of differing perspective to the user's eyes, to be perceived by the user as three-dimensional.
- Some displays are configured for viewing three-dimensional images without the user having to wear special glasses, such as by using techniques of autostereoscopy. For example, a display may include a parallax barrier that has a layer of material with a series of precision slits. The parallax barrier is placed proximal to a display so that a user's eyes each see a different set of pixels to create a sense of depth through parallax. Another type of display for viewing three-dimensional images is one that includes a lenticular lens. A lenticular lens includes an array of magnifying lenses configured so that when viewed from slightly different angles, different images are magnified. Displays are being developed that use lenticular lenses to enable autostereoscopic images to be generated.
- Each technique for achieving three-dimensional image viewing functionality involves transmitting three-dimensional video content to a display device, so that the display device can display three-dimensional images that are represented by the three-dimensional video content to a user. A variety of issues may arise with respect to such transmission. For example, errors that occur during the transmission may cause frame data in the video content to become corrupted. In another example, a source of the video content and/or the channels through which the video content is transferred may become temporarily unable to handle a load that is imposed by the video content. In yet another example, the display device may be capable of processing frame data of a greater number of perspectives than the source is capable of providing.
- Methods, systems, and apparatuses are described for interpolating three-dimensional video content as shown in and/or described herein in connection with at least one of the figures, as set forth more completely in the claims.
- The accompanying drawings, which are incorporated herein and form part of the specification, illustrate embodiments of the present invention and, together with the description, further serve to explain the principles involved and to enable a person skilled in the relevant art(s) to make and use the disclosed technologies.
-
FIG. 1 is a block diagram of an exemplary system for generating three-dimensional video content that may be encoded in accordance with an embodiment. -
FIG. 2 is a block diagram of an exemplary display system according to an embodiment. -
FIG. 3 depicts an exemplary implementation of an encoding system shown in FIG. 2 in accordance with an embodiment. -
FIGS. 4-9 show flowcharts of exemplary methods for encoding portions of three-dimensional video content for subsequent interpolation according to embodiments. -
FIG. 10 depicts an exemplary implementation of a decoding system shown in FIG. 2 in accordance with an embodiment. -
FIGS. 11-16 show flowcharts of exemplary methods for decoding portions of encoded three-dimensional video content using interpolation according to embodiments. -
FIGS. 17-20 illustrate exemplary interpolation techniques according to embodiments. -
FIG. 21 is a block diagram of an exemplary electronic device according to an embodiment. - The features and advantages of the disclosed technologies will become more apparent from the detailed description set forth below when taken in conjunction with the drawings, in which like reference characters identify corresponding elements throughout. In the drawings, like reference numbers generally indicate identical, functionally similar, and/or structurally similar elements. The drawing in which an element first appears is indicated by the leftmost digit(s) in the corresponding reference number.
- I. Introduction
- The following detailed description refers to the accompanying drawings that illustrate exemplary embodiments of the present invention. However, the scope of the present invention is not limited to these embodiments, but is instead defined by the appended claims. Thus, embodiments beyond those shown in the accompanying drawings, such as modified versions of the illustrated embodiments, may nevertheless be encompassed by the present invention.
- References in the specification to “one embodiment,” “an embodiment,” “an example embodiment,” or the like, indicate that the embodiment described may include a particular feature, structure, or characteristic, but every embodiment may not necessarily include the particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same embodiment. Furthermore, when a particular feature, structure, or characteristic is described in connection with an embodiment, it is submitted that it is within the knowledge of one skilled in the relevant art(s) to implement such feature, structure, or characteristic in connection with other embodiments whether or not explicitly described.
- Furthermore, it should be understood that spatial descriptions (e.g., “above,” “below,” “up,” “left,” “right,” “down,” “top,” “bottom,” “vertical,” “horizontal,” etc.) used herein are for purposes of illustration only, and that practical implementations of the structures described herein can be spatially arranged in any orientation or manner.
- II. Example Embodiments
- Example embodiments relate to interpolation of three-dimensional video content. Three-dimensional video content is video content that includes portions representing respective frame sequences that provide respective perspective views of a given subject matter over the same period of time. In accordance with some embodiments, an upstream device analyzes the three-dimensional video content to identify one or more interpolation opportunities. An interpolation opportunity occurs when a target perspective view that is associated with the three-dimensional video content is between reference perspective views that are associated with the three-dimensional video content. The target perspective view and the reference perspective views are perspective views of a common video event that are provided by respective sequences of frames (alternatively referred to herein as “images” or “pictures”) that are represented by respective portions of the three-dimensional video content.
- For example, assume that three-dimensional video content includes portions PA, PB, and PC that represent respective perspective views VA, VB, and VC for illustrative purposes. Further assume that VB is between VA and VC. In accordance with this example, an interpolation opportunity is said to occur for providing an interpolated representation of PB based on PA and PC.
- If an interpolation opportunity is identified, frame data that is associated with the interpolation opportunity may be replaced with an interpolation marker. In accordance with the example mentioned above, the upstream device may replace PB with the interpolation marker.
- When a downstream device receives the three-dimensional video content, which includes the interpolation marker, the downstream device may replace the interpolation marker with an interpolated representation of the frame data that the interpolation marker replaced. For instance, the downstream device may interpolate between the portions of the three-dimensional video content that represent the sequences of frames that provide the reference perspective views to generate an interpolated representation (a.k.a. an interpolation) of the portion of the three-dimensional video content that represents the sequence of frames that provides the target perspective view. In accordance with the example mentioned above, the downstream device may interpolate between PA and PC to generate an interpolated representation of PB.
- In some embodiments, the downstream device identifies a frame that is not directly represented by data that is included in the three-dimensional video content. For example, the frame may be represented by an interpolation marker. However, in such embodiments, the downstream device may perform an interpolation operation with respect to portions of the three-dimensional video content even in the absence of an interpolation marker. For example, the data may be corrupted. In accordance with this example, the frame may be missing from the data, or a portion of the data that corresponds to the frame may include erroneous data. Accordingly, the interpolation need not necessarily be performed in response to an interpolation marker.
- The embodiments described herein have a variety of benefits as compared to conventional techniques for processing video content. For example, the embodiments may increase the likelihood that a source of the video content and/or the channels through which the video content is transferred are capable of handling a load that is imposed by the video content. In another example, the embodiments may be capable of increasing the number of perspectives that are provided by the video content. In yet another example, the embodiments may be capable of correcting corrupted data that is included in the video content based on other data in the video content. For instance, the corrupted data may be corrected on the fly using one or more of the techniques described herein.
- The following subsections describe a variety of example embodiments of the present invention. It will be apparent to persons skilled in the relevant art that various changes in form and detail can be made to the embodiments described herein without departing from the spirit and scope of the invention. Thus, the breadth and scope of the present invention should not be limited by any of the example embodiments described herein.
- A. Example Display System and Method Embodiments
- In accordance with embodiments described herein, three-dimensional video content is represented as a plurality of separate portions (a.k.a. digital video streams). Each portion represents a respective frame sequence that provides a respective perspective view of a video event. This is illustrated by
FIG. 1 , which is a diagram of an exemplary system 100 for generating three-dimensional video content that may be encoded in accordance with an embodiment. As shown in FIG. 1 , system 100 includes a plurality of video cameras 102A-102N that are directed at and operate to record images of the same subject matter 104 from different perspectives over the same period of time. This results in the generation of three-dimensional video content 106, which includes N different portions 108A-108N that provide different perspective views of subject matter 104 over the same period of time. - Of course, techniques other than utilizing video cameras may be used to produce the
different portions 108A-108N. For example, one or more of the portions 108A-108N may be created in a manual or automated fashion by digital animators using advanced graphics and animation tools. Additionally, at least one of the portions 108A-108N may be created by using a manual or automated interpolation process that creates a portion based on analysis of at least two of the other portions. For example, with reference to FIG. 1 , if camera 102B were absent, a digital video stream corresponding to the perspective view of subject matter 104 provided by that camera could nevertheless be created by performing an interpolation process on the portions of the three-dimensional video content 106 produced by camera 102A and another of the cameras. Still other techniques not described herein may be used to produce one or more of the different digital video streams. - Display systems have been described that can display a single image of certain subject matter to provide a two-dimensional view thereof and that can also display two images of the same subject matter viewed from different perspectives in an integrated manner to provide a three-dimensional view thereof. Such two-dimensional (2D)/three-dimensional (3D) display systems can further display a multiple of two images (e.g., four images, eight images, etc.) of the same subject matter viewed from different perspectives in an integrated manner to simultaneously provide multiple three-dimensional views thereof, wherein the particular three-dimensional view perceived by a viewer is determined based at least in part on the position of the viewer. Examples of such 2D/3D display systems are described in the following commonly-owned, co-pending U.S. Patent Applications: U.S. patent application Ser. No. 12/845,409, filed on Jul. 28, 2010, and entitled “Display with Adaptable Parallax Barrier”; U.S. patent application Ser. No. 12/845,440, filed on Jul. 28, 2010, and entitled “Adaptable Parallax Barrier Supporting Mixed 2D and
Stereoscopic 3D Display Regions”; and U.S. patent application Ser. No. 12/845,461, filed on Jul. 28, 2010, and entitled “Display Supporting Multiple Simultaneous 3D Views.” The entirety of each of these applications is incorporated by reference herein. - The
portions 108A-108N produced by system 100 can be obtained and provided to a 2D/3D display system as described above in order to facilitate the presentation of a two-dimensional view of subject matter 104, a single three-dimensional view of subject matter 104, or multiple three-dimensional views of subject matter 104. -
FIG. 2 is a block diagram of an exemplary display system 200 according to an embodiment. Generally speaking, display system 200 operates to transmit three-dimensional video content, such as three-dimensional video content 106 of FIG. 1 , to a display device, so that the display device can display three-dimensional images that are represented by the three-dimensional video content to user(s). According to embodiments, display system 200 interpolates between portions of the three-dimensional video content that correspond to respective perspective views to provide frame data that corresponds to another perspective view. As shown in FIG. 2 , display system 200 includes source(s) 202 and a display device 204. Source(s) 202 provide three-dimensional video content 206. Source(s) 202 can include any number of sources, including one, two, three, etc. Each source provides one or more portions of the three-dimensional video content 206. Examples of a source include but are not limited to a computer storage disc (e.g., a digital video disc (DVD) or a Blu-Ray® disc), local storage on a display device, a remote server (i.e., a server that is located remotely from the display device), a gaming system, a satellite, a cable headend, and a point-to-point system. - Some of the portions of the three-
dimensional video content 206 may serve as reference portions, while others serve as supplemental portions, though the scope of the embodiments is not limited in this respect. For instance, the supplemental portions may be used to increase the number of perspective views that are included in the three-dimensional video content beyond the number of perspective views that are represented by the reference portions. The reference portions may include 2D data, 3D2 data, 3D4 data, 3D8 data, etc. Supplemental portions may include auto-interpolated 2D-3D2 (single stream) data, manually generated interpolation 3D2 data, A-I 3D4 (3 stream) data, M-G-I 3D4 (3 stream) data, etc. - As shown in
FIG. 2 , source(s) 202 includes an encoding system 208. Encoding system 208 encodes the three-dimensional video content 206 to provide encoded three-dimensional video content 210. For example, encoding system 208 may replace frame data in the three-dimensional video content 206 with an interpolation marker. The interpolation marker may indicate that interpolation is to be performed between portions of the three-dimensional video content in order to generate an interpolated representation of the frame data that is replaced with the interpolation marker. The interpolation marker may be accompanied by instructions for generating the interpolated representation. It will be recognized, however, that encoding system 208 need not necessarily replace frame data in the three-dimensional video content 206 with an interpolation marker. Regardless, encoding system 208 transmits the encoded three-dimensional video content 210 toward display device 204 via communication channels 212. - It will be further recognized that source(s) 202 need not necessarily include
encoding system 208. For example, source(s) 202 may store the encoded three-dimensional video content 210, rather than generating the encoded three-dimensional video content 210 based on the three-dimensional video content 206. -
Communication channels 212 may include one or more local device pathways, point-to-point links, and/or pathways in a hybrid fiber coaxial (HFC) network, a wide-area network (e.g., the Internet), a local area network (LAN), another type of network, or a combination thereof. Communication channels 212 may support wired, wireless, or both transmission media, including satellite, terrestrial (e.g., fiber optic, copper, twisted pair, coaxial, or the like), radio, microwave, free-space optics, and/or any other form or method of transmission. -
Display device 204 displays images to user(s) upon receipt of the encoded three-dimensional video content 210. Display device 204 may be implemented in various ways. For instance, display device 204 may be a television display (e.g., a liquid crystal display (LCD) television, a plasma television, etc.), a computer monitor, a projection system, or any other type of display device. -
Display device 204 includes an interpolation-enabled decoding system 214, display circuitry 216, and a screen 218. Decoding system 214 decodes the encoded three-dimensional video content 210 to provide decoded three-dimensional video content 220. For instance, decoding system 214 may interpolate between portions of a decoded representation of the encoded three-dimensional video content 210 to generate one or more of the portions of the decoded three-dimensional video content 220. In one example, decoding system 214 may interpolate in response to detecting an interpolation indicator in the encoded three-dimensional video content 210. In another example, decoding system 214 may interpolate in response to determining that a frame that is included in the decoded representation of the encoded three-dimensional video content 210 is not directly represented by data in the decoded representation. For instance, decoding system 214 may determine that the frame is replaced by an interpolation marker, that the frame is missing from the data, or that a portion of the data that corresponds to the frame includes erroneous data. Interpolation that is performed by decoding system 214 may be incorporated into a decoding process or may be performed after such a decoding process on raw data. - In an embodiment,
decoding system 214 maintains synchronization of the portions that are included in the decoded three-dimensional video content 220. For instance, such synchronization may be maintained during inter-reference frame periods, during screen reconfiguration, etc. If decoding system 214 is unable to maintain synchronization with respect to one or more portions of the decoded three-dimensional video content 220, decoding system 214 may perform interpolation to generate interpolated representations of those portion(s) until synchronization is re-established. Decoding system 214 may synchronize 3DN adjustments with reference frame occurrence, where N can be any positive integer greater than or equal to two. A 3DN adjustment may include the addition of frame data corresponding to a perspective view, for example. For each additional perspective that is represented by the decoded three-dimensional video content 220, N is incremented by one. -
Display circuitry 216 directs display of one or more of the frame sequences that are represented by the decoded three-dimensional video content 220 toward screen 218, as indicated by arrow 222, for presentation to the user(s). It will be recognized that although display circuitry 216 is labeled as such, the functionality of display circuitry 216 may be implemented in hardware, software, firmware, or any combination thereof. -
Screen 218 displays the frame sequence(s) that are received from display circuitry 216 to the user(s). Screen 218 may be any suitable type of screen, including but not limited to an LCD screen, a plasma screen, a light emitting device (LED) screen (e.g., an OLED (organic LED) screen), etc. - It will be recognized that
encoding system 208 may be external to source(s) 202. Moreover, decoding system 214 may be external to display device 204. For instance, encoding system 208 and decoding system 214 may be implemented in a common device, such as a transcoder that is coupled between source(s) 202 and display device 204. - It will be further recognized that feedback may be provided from
communication channels 212 and/or display device 204 to any one or more of the source(s) 202. For example, display device 204 may provide feedback to indicate an error that occurs with respect to frame data that is included in encoded three-dimensional video content 210, one or more characteristics that are associated with display device 204, etc. Examples of such characteristics include but are not limited to a load that is associated with display device 204 and a number of perspective views that display device 204 is capable of processing. In another example, channels 212 may provide feedback to indicate an error that occurs with respect to frame data that is included in encoded three-dimensional video content 210, one or more characteristics (e.g., a load) that are associated with the channels 212, etc.
-
FIG. 3 depicts a block diagram of anencoding system 300, which is an exemplary implementation ofencoding system 208 ofFIG. 2 , in accordance with an embodiment. As shown inFIG. 3 ,encoding system 300 includesinput circuitry 302,processing circuitry 304, andoutput circuitry 306.Input circuitry 302 serves as an input interface forencoding system 300.Processing circuitry 304 receives a plurality ofportions 310A-310N of three-dimensional video content 308 throughinput circuitry 302. Each of theportions 310A-310N represents a respective sequence of frames that provides a respective perspective view of a video event.Processing circuitry 304 encodes theportions 310A-310N to provide encodedportions 314A-314N. -
Processing circuitry 304 analyzes at least some of the portions 310A-310N to identify one or more interpolation opportunities. An interpolation opportunity occurs when a target perspective view that is associated with the three-dimensional video content 308 is between reference perspective views that are associated with the three-dimensional video content 308 . The target perspective view and the reference perspective views are provided by respective sequences of frames that are represented by respective portions of the three-dimensional video content 308 . For each identified interpolation opportunity, processing circuitry 304 replaces frame data that is included in the corresponding portion of the three-dimensional video content 308 with an interpolation marker. For example, if processing circuitry 304 identifies an interpolation opportunity in each of first portion 310A and second portion 310B, processing circuitry replaces frame data that is included in first portion 310A with an interpolation marker and replaces frame data that is included in second portion 310B with another interpolation marker. -
portions 310A-310N of the three-dimensional video content 308 are to be used for generating an interpolated representation of the frame data that the first interpolation marker replaces. A second interpolation instruction that corresponds to a second interpolation marker may specify which of theportions 310A-310N are to be used for generating an interpolated representation of the frame data that the second interpolation marker replaces, and so on. - Each interpolation marker may specify a type of interpolation to be performed to generate an interpolated representation of the frame data that the interpolation marker replaces. For instance, a first type of interpolation may assign a first weight to a first reference portion of the three-
dimensional video content 308 and a second weight that is different from the first weight to a second reference portion of the three-dimensional video content 308 for generating an interpolated representation of frame data. A second type of interpolation may assign equal weights to the first and second reference portions of the three-dimensional video content 308. Other exemplary types of interpolation include but are not limited to linear interpolation, polynomial interpolation, and spline interpolation.
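- To make the distinction among interpolation types concrete, the following toy fragment (not part of the disclosure) estimates one pixel of a target view from three reference views, first linearly and then with a fitted polynomial; the view-position coordinates and sample values are assumed purely for illustration:

```python
import numpy as np

# One pixel location as seen from three reference views positioned
# (hypothetically) at view coordinates 0.0, 0.5, and 1.0.
view_positions = np.array([0.0, 0.5, 1.0])
pixel_values = np.array([100.0, 130.0, 110.0])
target = 0.25  # coordinate of the view to be interpolated

# Linear interpolation between the two nearest reference views: 115.0 here.
linear_estimate = np.interp(target, view_positions, pixel_values)

# Polynomial (quadratic) interpolation through all three reference views.
coeffs = np.polyfit(view_positions, pixel_values, deg=2)
polynomial_estimate = np.polyval(coeffs, target)
```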
- Output circuitry 306 serves as an output interface for encoding system 300. Processing circuitry 304 delivers encoded three-dimensional video content 312 that includes encoded portions 314A-314N through output circuitry 306. - Portions of three-dimensional video content, such as
portions 310A-310N, may be encoded in any of a variety of ways. FIGS. 4-9 show flowcharts 400 , 500 , 600 , 700 , 800 , and 900 of exemplary methods for encoding portions of three-dimensional video content for subsequent interpolation according to embodiments. Flowcharts 400 , 500 , 600 , 700 , 800 , and 900 may be performed by encoding system 300 shown in FIG. 3 , for example. However, the methods of flowcharts 400 , 500 , 600 , 700 , 800 , and 900 are not limited to that embodiment. Further structural and operational embodiments will be apparent to persons skilled in the relevant art(s) based on the discussion regarding flowcharts 400 , 500 , 600 , 700 , 800 , and 900 . - In all of
FIGS. 4-9 , the basic approach involves encoder processing of at least a first sequence of frames and a second sequence of frames, wherein the first sequence represents a first perspective view (e.g., a right eye view) while the second sequence represents a second perspective view (e.g., a left eye view). As an output of such encoder processing, many frames will be encoded based on the frame itself (no referencing to other frames), on internal referencing (referencing frames within the same sequence of frames), or on external referencing (referencing frames outside of the current frame's sequence of frames). In addition, whenever an interpolation opportunity presents itself, instead of sending encoded data for such a frame, such encoded data will either be (i) merely deleted (forcing a decoder to perform interpolation based on its determination that the encoded frame data is missing), or (ii) replaced with interpolation information. Such interpolation information may be nothing more than an indicator or marker (an “interpolation marker”) but may also contain interpolation instructions, data and parameters.
- Once a candidate frame has been identified, the encoder processing involves applying at least one and, depending on the embodiment, possibly multiple interpolation approaches (along with various underlying parameter variations). If only one approach is applied, a determination is made as to whether such interpolation can be used to yield a visually acceptable output. When multiple approaches are available, a selection is made therefrom of, for example, (i) a best match which is also determined to be visually acceptable, (ii) the first match that can be used to yield something visually acceptable, or (iii) an acceptable match selected at least in part based on the ease of decoding and/or the size of the interpolation information. If best and/or acceptable interpolation information is identified which saves a justifiable amount of data, the encoder processing involves selecting to use the interpolation information (or use nothing to force default interpolation by a decoder) instead of the encoded frame data in subsequent storage and/or transmissions.
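One way such a selection might be structured is sketched below; the error measure and acceptability limit are assumptions, and a real encoder could substitute any perceptual quality metric.

```python
import numpy as np

def mean_abs_error(a: np.ndarray, b: np.ndarray) -> float:
    # Simple distortion measure between a frame and its estimate.
    return float(np.mean(np.abs(a.astype(np.float32) - b.astype(np.float32))))

def select_interpolation(frame: np.ndarray, approaches, max_error: float):
    """Evaluate candidate approaches and keep the best acceptable one.

    `approaches` is a list of (name, interpolate_fn) pairs, where each
    interpolate_fn() returns an estimate of `frame`. Returns the best
    acceptable (name, error) pair, or None if no approach is visually
    acceptable (in which case the encoded frame data is kept).
    """
    best = None
    for name, interpolate_fn in approaches:
        error = mean_abs_error(frame, interpolate_fn())
        if error <= max_error and (best is None or error < best[1]):
            best = (name, error)
    return best
```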
- For example, in a three frame sequence, a camera is fixed and only a relatively small object within the field of view moves relatively slowly therein, while the background remains practically unchanged. An interpolation opportunity might involve replacing the middle frame in the sequence with nothing at all, to force the decoder to interpolate between the first frame and the third frame in the sequence. Alternatively, a marker (an interpolation marker) might be used instead of the second frame's encoded data. Upon identification of such marker, a decoder might either (i) substitute the first or the third frame data for the missing second frame data, which is likely to not be noticed by a viewer due to the relatively short frame rate period, (ii) create a substitute for the missing second frame data by creating an average between the first frame and the third frame (e.g., a 50/50 "weighted" addition), or (iii) otherwise create a substitute based on a weighted addition percentage or using some other interpolation approach that may utilize interpolation parameters, filters and other data.
- In another example, when a camera is panning, to interpolate a missing middle frame, the frames on either side of the missing frame might be stitched together and then cropped to produce a substitute for the missing middle frame data. Of course, in a panning scene, some objects such as a moving car may appear stationary at least in areas within the field of view, so multiple interpolation approaches within a single frame may be applied.
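As a rough sketch of the panning case (under the simplifying assumption of a purely horizontal, constant-rate pan whose per-frame offset is already known), the substitute middle frame might be approximated as follows; note that np.roll wraps pixels around the frame edge, which a real implementation would replace with true stitching and cropping.

```python
import numpy as np

def interpolate_panned_frame(prev_frame: np.ndarray, next_frame: np.ndarray,
                             pan_dx: int) -> np.ndarray:
    """Approximate a missing middle frame during a horizontal pan.

    `pan_dx` is the estimated per-frame pan in pixels (assumed to be
    supplied, e.g., by a motion search). Each neighbor is shifted toward
    the middle camera position and the two are averaged.
    """
    half = pan_dx // 2
    shifted_prev = np.roll(prev_frame, -half, axis=1)  # shift toward middle
    shifted_next = np.roll(next_frame, half, axis=1)   # shift toward middle
    mid = (shifted_prev.astype(np.uint16) + shifted_next.astype(np.uint16)) // 2
    return mid.astype(np.uint8)
```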
- Likewise, although the above two examples of interpolation opportunities were applied to a single camera view's frame sequence, interpolation with reference to other camera view frame sequences may also be performed. For example, an object moving in a frame of a first camera's frame sequence might have strong correlation with the same object a short time later captured in a frame of the second camera's frame sequence. Thus, if the correlating frame of the second camera's frame sequence is discarded or replaced, at least the frame in the first camera's frame sequence can be used by a decoder to recreate the missing data. In addition, a single frame (or frame portion) alone or along with other frames (or frame portions) from either or both camera sequences can be used by the decoder to recreate the substitute.
- Thus, by sending no interpolation information (i.e., no replacement for deleted frame data), a decoder will conclude that interpolation is needed and respond by either repeating an adjacent frame (e.g., if the neighboring frames are substantially different) or creating a middling alternative based on both preceding and subsequent frame data using a single camera's frame sequence. If the interpolation information contains only a marker, the decoder will immediately do the same as above without having to indirectly reach the conclusion that interpolation is needed. The interpolation information may also contain further items that either direct or assist a decoder in performing a desired interpolation. That is, for example, the interpolation information may also contain interpolation instructions, frame reference identifiers (that identify a frame or frames on which a decoder can base its interpolation), interpolation parameters (weighting factors, interpolation approaches to be used, regional/area definitions, etc.), filters (to be applied in the interpolation process) and any accompanying data (e.g., texture maps, etc.) that may enhance the interpolation process.
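The decoder-side dispatch just described might look like the following sketch; the InterpolationMarker type and the dictionary-based instructions are illustrative assumptions, not a disclosed bitstream format.

```python
from dataclasses import dataclass
from typing import Optional
import numpy as np

@dataclass
class InterpolationMarker:
    # Optional instructions: reference identifiers, weights, filters, etc.
    instructions: Optional[dict] = None

def default_interpolation(prev_frame: np.ndarray, next_frame: np.ndarray) -> np.ndarray:
    # Middling alternative: equal-weight blend of the neighboring frames.
    return ((prev_frame.astype(np.uint16) + next_frame.astype(np.uint16)) // 2).astype(np.uint8)

def reconstruct_slot(slot, prev_frame, next_frame, decode_bitstream):
    """Reconstruct one frame slot from the received stream.

    `slot` is None (data deleted), an InterpolationMarker, or encoded
    bytes; `decode_bitstream` is the normal decoding path, passed in as
    a callable.
    """
    if slot is None or isinstance(slot, InterpolationMarker):
        # Missing data and a bare marker trigger the same default path;
        # the marker merely spares the decoder from inferring absence.
        instr = slot.instructions if isinstance(slot, InterpolationMarker) else None
        if instr and "weight" in instr:
            w = instr["weight"]
            blended = (w * prev_frame.astype(np.float32)
                       + (1 - w) * next_frame.astype(np.float32))
            return np.clip(blended, 0, 255).astype(np.uint8)
        return default_interpolation(prev_frame, next_frame)
    return decode_bitstream(slot)
```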
- For example, images captured by one camera might be very close to those captured a brief time later by another camera. Thus, instead of using merely adjacent reference frames for interpolation (such as the three frame sequence with a missing middle frame mentioned above), the encoder may choose to send interpolation information that identifies, for use in the interpolation process, one or more frames selected from other cameras' frame sequences and other possibly non-adjacent frames from within the same camera's frame sequence. The interpolation information may also include the various interpolation parameters mentioned above, interpolation approaches to be used, regional definitions in which such approaches and frames are used, filters and data.
- A single encoder can perform all or any portion of the above in association with a full frame or sections thereof. For instance, a single frame can be broken down into regions and interpolation per region can be different from that of another region.
-
FIGS. 4-9 are flow charts that illustrate several of many approaches for carrying out at least a portion of such encoder interpolation processing. More specifically, as shown in FIG. 4, flowchart 400 begins with step 402. In step 402, both a first portion of three-dimensional video content and a second portion of the three-dimensional video content are received. The first portion corresponds to data that represents at least one frame from a first sequence of frames that provide a first perspective view. The second portion corresponds to data that represents at least one frame from a second sequence of frames that provide a second perspective view. Although not shown, a third portion that corresponds to data that represents at least one other frame from either the first or the second sequences of frames could also be gathered and considered in the interpolation process. Of course, many other portions from various other frames can also be gathered and used. - In the implementation example of
FIG. 3, the processing circuitry 304 receives all portions, including both the first portion and the second portion of the three-dimensional video content, through the input circuitry 302. - At
step 404, the first portion and the second portion are encoded. The encoding involves, at least in part, analyzing the first portion and the second portion to identify an interpolation opportunity. In the implementation example of FIG. 3, the processing circuitry 304 encodes the first portion and the second portion. - At
step 406, frame data is replaced with an interpolation marker. In the implementation example of FIG. 3, the processing circuitry 304 replaces the frame data with the interpolation marker. - At
step 408, an encoded representation of the three-dimensional video content is delivered. In the implementation example of FIG. 3, the processing circuitry 304 delivers the encoded representation of the three-dimensional video content (e.g., encoded three-dimensional video content 312) through the output circuitry 306. - In some embodiments, one or more of the
steps 402, 404, 406, and/or 408 of flowchart 400 may not be performed. Moreover, other steps in addition to or in lieu of steps 402, 404, 406, and/or 408 may be performed. -
FIG. 5 shows a flowchart 500 that illustrates one of many possible implementations of step 404 of flowchart 400 in FIG. 4 in accordance with an embodiment of the present invention. As shown in FIG. 5, flowchart 500 includes step 502, which may be applied in step 404 of flowchart 400 in FIG. 4, for example. In step 502, a current frame is compared with frames that neighbor the current frame to identify the interpolation opportunity. For example, the frames that neighbor the current frame may be included in respective portions of the three-dimensional video content that correspond to respective reference perspective views. In accordance with this example, the current frame may be included in a portion of the three-dimensional video content that corresponds to a perspective view that is between the reference perspective views. In an embodiment, the interpolation opportunity is identified in a first frame of the first portion while the neighboring frames include a second frame from the second portion. In the implementation example of FIG. 3, the processing circuitry 304 may compare the current frame with the frames that neighbor the current frame (neighbors within either or both of the current camera view frame sequence and other camera views' frame sequences) to identify the interpolation opportunity. - In some embodiments, step 404 of
flowchart 400 may be performed in response to any one or more of the steps shown in flowcharts 600, 700, 800, and 900 of FIGS. 6-9. As shown in FIG. 6, flowchart 600 includes step 602. In step 602, a determination is made that an accuracy of an estimate of the frame data is greater than a threshold accuracy. In the implementation example of FIG. 3, the processing circuitry 304 determines that the accuracy of the estimate is greater than the threshold accuracy. For example, the processing circuitry 304 may perform an interpolation operation with respect to the first portion and/or the second portion to generate the estimate of the frame data. In accordance with this example, processing circuitry 304 may compare the estimate to the frame data to determine the accuracy of the estimate. Processing circuitry 304 may compare the accuracy to the threshold accuracy to determine whether the accuracy is greater than the threshold accuracy. For instance, processing circuitry 304 may be configured to replace the frame data with an interpolation marker at step 406 if the accuracy of the estimate is greater than the threshold accuracy, but not if the accuracy is less than the threshold accuracy.
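The accuracy test of step 602 might be realized as in the following sketch; PSNR is used here as an assumed accuracy measure, and the 38 dB threshold is an illustrative value rather than one taken from the disclosure.

```python
import numpy as np

def psnr(original: np.ndarray, estimate: np.ndarray) -> float:
    # Peak signal-to-noise ratio of an 8-bit frame estimate.
    mse = np.mean((original.astype(np.float32) - estimate.astype(np.float32)) ** 2)
    if mse == 0:
        return float("inf")
    return 10.0 * np.log10(255.0 ** 2 / mse)

def should_replace_with_marker(original: np.ndarray, estimate: np.ndarray,
                               threshold_db: float = 38.0) -> bool:
    # Step 602 feeding step 406: replace the frame data with a marker
    # only when the interpolated estimate is accurate enough.
    return psnr(original, estimate) > threshold_db
```
- As shown in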
FIG. 7, a determination is made that an error occurs with respect to the frame data. For instance, it may be desirable to avoid sending frame data with respect to which an error is known to have occurred. - As shown in
FIG. 8, a determination is made that a source that generates the three-dimensional video content has at least one specified characteristic. For example, a load that is associated with the source may be greater than a threshold load. In another example, the source may not support a viewing format that is associated with the frame data. - As shown in
FIG. 9, a determination is made that a communication channel via which the three-dimensional video content is to be transmitted has at least one specified characteristic. For instance, a load that is associated with the communication channel may be greater than a threshold load. - C. Example Decoding Embodiments
-
FIG. 10 depicts a block diagram of a decoding system 1000, which is an exemplary implementation of interpolation-enabled decoding system 214 of FIG. 2, in accordance with an embodiment. As shown in FIG. 10, decoding system 1000 includes input circuitry 1002, processing circuitry 1004, and output circuitry 1006. Input circuitry 1002 serves as an input interface for decoding system 1000. Processing circuitry 1004 receives a plurality of encoded portions 1010A-1010N of encoded three-dimensional video content 1008 through input circuitry 1002. Each of the encoded portions 1010A-1010N represents a respective sequence of frames that provides a respective perspective view of a video event. Processing circuitry 1004 decodes the encoded portions 1010A-1010N to provide decoded portions 1014A-1014M, which are included in decoded three-dimensional video content 1012. The decoded three-dimensional video content 1012 is also referred to as a decoded representation of the encoded three-dimensional video content 1008. It will be recognized that the number of encoded portions "N" need not necessarily be equal to the number of decoded portions "M". For instance, processing circuitry 1004 may interpolate between any of the encoded portions 1010A-1010N to generate one or more of the decoded portions 1014A-1014M. - In some embodiments,
processing circuitry 1004 responds to one or more interpolation markers by generating frame data to replace the respective interpolation marker(s). For instance, processing circuitry 1004 may respond to a first interpolation marker by generating first frame data to replace the first interpolation marker. Processing circuitry 1004 may respond to a second interpolation marker by generating second frame data to replace the second interpolation marker, and so on. The interpolation marker(s) are included in the encoded three-dimensional video content 1008. The instance(s) of frame data that replace the respective interpolation marker(s) are included in the decoded three-dimensional video content 1012. - Any one or more of the interpolation marker(s) may be accompanied by an interpolation instruction. For instance,
processing circuitry 1004 may use a first subset of the encoded portions 1010A-1010N that is specified by a first interpolation instruction that corresponds to a first interpolation marker to generate first frame data to replace the first interpolation marker. Processing circuitry 1004 may use a second subset of the encoded portions 1010A-1010N that is specified by a second interpolation instruction that corresponds to a second interpolation marker to generate second frame data to replace the second interpolation marker, and so on. Each interpolation instruction (or the interpolation marker that it accompanies) may specify a type of interpolation to be performed to generate the frame data that the interpolation marker replaces. - In other embodiments,
processing circuitry 1004 identifies one or more frames that are not directly represented by one or more respective encoded portions of the encoded three-dimensional video content 1008. For example, a frame is not directly represented if the frame is replaced with an interpolation marker in the encoded three-dimensional video content 1008. In another example, a frame is not directly represented if the frame is missing from the encoded three-dimensional video content 1008. In yet another example, a frame is not directly represented if the frame is represented by erroneous data in the encoded three-dimensional video content 1008. Missing frames and erroneous frame data may occur, for example, because of (i) defects in storage media or the storage process, and (ii) losses or unacceptable delays encountered in a less than perfect communication pathway. Another example resulting in a need for interpolation occurs when referenced frame data cannot be found or is itself erroneous (corrupted). That is, the current frame data is correct, but to decode it, one or more other portions of frame data (portions directly associated with different frames) happen to be missing or to contain erroneous data. In such a case, without an ability to decode the present, correct frame data, interpolation may be performed to generate the current frame as an alternative. Processing circuitry 1004 produces interpolation(s) of the respective frame(s) that are not directly represented.
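One way to organize the cases enumerated above is sketched below; the enum, field names, and integrity checks are hypothetical.

```python
from enum import Enum, auto

class FrameStatus(Enum):
    DIRECT = auto()     # encoded data present and decodable as-is
    MARKER = auto()     # replaced by an interpolation marker
    MISSING = auto()    # absent (loss, delay, or storage media defect)
    ERRONEOUS = auto()  # present but corrupt, or its references are unusable

def classify_frame(slot, references_ok: bool, integrity_ok: bool) -> FrameStatus:
    """Classify one frame slot; any status other than DIRECT causes the
    decoder to produce an interpolation instead."""
    if slot is None:
        return FrameStatus.MISSING
    if getattr(slot, "is_marker", False):
        return FrameStatus.MARKER
    if not integrity_ok or not references_ok:
        # Covers both corrupt frame data and correct data whose
        # referenced frames are missing or themselves corrupted.
        return FrameStatus.ERRONEOUS
    return FrameStatus.DIRECT
```
-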
Output circuitry 1006 serves as an output interface for decoding system 1000. Processing circuitry 1004 delivers the decoded three-dimensional video content 1012 through output circuitry 1006. - Portions of encoded three-dimensional video content, such as encoded
portions 1010A-1010N, may be decoded in any of a variety of ways. FIGS. 11-16 show flowcharts 1100, 1200, 1300, 1400, 1500, and 1600 of exemplary methods for decoding portions of encoded three-dimensional video content in accordance with embodiments. Flowcharts 1100-1600 may be performed by decoding system 1000 shown in FIG. 10, for example. However, the methods of flowcharts 1100-1600 are not limited to that embodiment; further embodiments will be apparent based on the discussion regarding flowcharts 1100-1600. - In all of
FIGS. 11-16, the basic approach involves decoder processing of at least a first sequence of frames and a second sequence of frames, wherein the first sequence represents a first perspective view (e.g., a right eye view) while the second sequence represents a second perspective view (e.g., a left eye view). The decoder receives many frames that are encoded based on the frame itself (no referencing to other frames), internal referencing (referencing frames within the same sequence of frames), or external referencing (referencing frames outside of the current frame's sequence of frames). In addition, for some frames, instead of encoded data being received for such a frame, such encoded data is either (i) deleted or (ii) replaced with interpolation information. Upon determining that encoded data has been deleted or replaced with interpolation information, the decoder performs interpolation to generate substitute frame data. Interpolation information may be nothing more than an indicator or marker (an "interpolation marker") but may also contain interpolation instructions, data and parameters. The decoder processing involves applying at least one and, depending on the embodiment, possibly multiple interpolation approaches (along with various underlying parameter variations). - For example, in a three frame sequence, a camera is fixed and only a relatively small object within the field of view moves relatively slowly therein, while the background remains practically unchanged. In accordance with this example, an encoder may replace the middle frame in the sequence with nothing at all. The decoder detects that the middle frame is missing and interpolates between the first frame and the third frame in the sequence. Alternatively, the encoder may use a marker (an interpolation marker) instead of the second frame's encoded data. Upon identification of such marker, the decoder might either (i) substitute the first or the third frame data for the missing second frame data, which is likely to not be noticed by a viewer due to the relatively short frame rate period, (ii) create a substitute for the missing second frame data by creating an average between the first frame and the third frame (e.g., a 50/50 "weighted" addition), or (iii) otherwise create a substitute based on a weighted addition percentage or using some other interpolation approach that may utilize interpolation parameters, filters and other data.
- In another example, when a camera is panning, to interpolate a missing middle frame, the decoder may stitch the frames on either side of the missing frame together and then crop the stitched result to produce a substitute for the missing middle frame data. Of course, in a panning scene, some objects such as a moving car may appear stationary at least in areas within the field of view, so multiple interpolation approaches within a single frame may be applied.
- Likewise, although the above two interpolation examples were applied to a single camera view's frame sequence, interpolation with reference to other camera view frame sequences may also be performed. For example, an object moving in a frame of a first camera's frame sequence might have strong correlation with the same object a short time later captured in a frame of the second camera's frame sequence. Thus, if the correlating frame of the second camera's frame sequence is discarded or replaced, at least the frame in the first camera's frame sequence can be used by the decoder to recreate the missing data. In addition, a single frame (or frame portion) alone or along with other frames (or frame portions) from either or both camera sequences can be used by the decoder to recreate the substitute.
- Thus, if frame data is missing and no interpolation information (i.e., no replacement for the missing frame data) is received by the decoder, the decoder will conclude that interpolation is needed and respond by either repeating an adjacent frame (e.g., if the neighboring frames are substantially different) or creating a middling alternative based on both preceding and subsequent frame data using a single camera's frame sequence. If the interpolation information contains only a marker, the decoder will immediately do the same as above without having to indirectly reach the conclusion that interpolation is needed. The interpolation information may also contain further items that either direct or assist the decoder in performing a desired interpolation. That is, for example, the interpolation information may also contain interpolation instructions, frame reference identifiers (that identify a frame or frames on which the decoder can base its interpolation), interpolation parameters (weighting factors, interpolation approaches to be used, regional/area definitions, etc.), filters (to be applied in the interpolation process) and any accompanying data (e.g., texture maps, etc.) that may enhance the interpolation process.
- For example, images captured by one camera might be very close to those captured a brief time later by another camera. Thus, instead of using merely adjacent reference frames for interpolation (such as the three frame sequence with a missing middle frame mentioned above), an encoder may choose to send interpolation information that identifies, for use by the decoder, one or more frames selected from other cameras' frame sequences and other possibly non-adjacent frames from within the same camera's frame sequence. The interpolation information may also include the various interpolation parameters mentioned above, interpolation approaches to be used, regional definitions in which such approaches and frames are used, filters and data.
- A decoder can perform all or any portion of the above in association with a full frame or sections thereof. For instance, the decoder can perform interpolation operations on respective regions of a single frame, and interpolation per region can be different from that of another region.
-
FIGS. 11-16 are flow charts that illustrate several of many approaches for carrying out at least a portion of such decoder interpolation processing. More specifically, as shown in FIG. 11, flowchart 1100 begins with step 1102. In step 1102, both a first encoded portion of a first encoded sequence of frames that represent a first perspective view and a second encoded portion of a second encoded sequence of frames that represent a second perspective view are received. In the implementation example of FIG. 10, the processing circuitry 1004 receives the first encoded portion and the second encoded portion through the input circuitry 1002. - At
step 1104, the first encoded portion and the second encoded portion are decoded. The decoding involves responding to an interpolation marker by generating frame data to replace the interpolation marker. In the implementation example of FIG. 10, the processing circuitry 1004 decodes the first encoded portion and the second encoded portion. - At
step 1106, a decoded representation of the encoded three-dimensional video content is delivered. In the implementation example of FIG. 10, the processing circuitry 1004 delivers the decoded representation of the encoded three-dimensional video content (e.g., decoded three-dimensional video content 1012) through the output circuitry 1006. - In some example embodiments, one or
more of steps 1102, 1104, and/or 1106 of flowchart 1100 may not be performed. Moreover, steps in addition to or in lieu of steps 1102, 1104, and/or 1106 may be performed. - Instead of performing
step 1104 of flowchart 1100, the steps shown in flowchart 1200, flowchart 1300, or flowchart 1400 shown in respective FIGS. 12-14 may be performed. As shown in FIG. 12, flowchart 1200 begins at step 1202. In step 1202, a determination is made that a number of perspective views that a display is capable of processing is greater than a number of perspective views that is initially represented by the encoded three-dimensional video content. In the implementation example of FIG. 10, the processing circuitry 1004 determines that the number of perspective views that the display is capable of processing is greater than the number of perspective views that is initially represented by the encoded three-dimensional video content. - At
step 1204, an interpolation request is provided to an encoder. The interpolation request requests inclusion of an interpolation marker in the encoded three-dimensional video content. In the implementation example of FIG. 10, the processing circuitry 1004 provides the interpolation request through the output circuitry 1006. - At
step 1206, an interpolation is performed between a decoded version of the first encoded portion and a decoded version of the second encoded portion to generate frame data that corresponds to a third sequence of frames that represent a third perspective view to replace the interpolation marker. The third perspective view is not initially represented by the encoded three-dimensional video content. In the implementation example of FIG. 10, the processing circuitry 1004 interpolates between the decoded version of the first encoded portion and the decoded version of the second encoded portion to generate the frame data.
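The overall flow of flowchart 1200 might be sketched as follows; the callables stand in for the circuitry of FIG. 10, and the disparity-free midpoint blend is a deliberate simplification of real view synthesis.

```python
import numpy as np

def midpoint_view(left: np.ndarray, right: np.ndarray) -> np.ndarray:
    # Simplest intermediate-view estimate: equal-weight blend of two
    # neighboring perspective views (ignores per-pixel disparity).
    return ((left.astype(np.uint16) + right.astype(np.uint16)) // 2).astype(np.uint8)

def expand_views(display_views: int, content_views: int,
                 request_marker, decode_view):
    """Flowchart 1200 in sketch form; `request_marker` and `decode_view`
    are hypothetical callables."""
    if display_views > content_views:      # step 1202
        request_marker()                   # step 1204: interpolation request
        left = decode_view(0)              # decoded first encoded portion
        right = decode_view(1)             # decoded second encoded portion
        return midpoint_view(left, right)  # step 1206: synthesized view
    return None
```
- As shown in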
FIG. 13, flowchart 1300 begins at step 1302. In step 1302, an interpolation instruction is received from an upstream device. In the implementation example of FIG. 10, the processing circuitry 1004 receives the interpolation instruction through the input circuitry 1002. - At
step 1304, the first encoded portion and the second encoded portion are decoded. The decoding involves responding to an interpolation marker by generating frame data to replace the interpolation marker in accordance with the interpolation instruction. In the implementation example of FIG. 10, the processing circuitry 1004 decodes the first encoded portion and the second encoded portion. - As shown in
FIG. 14, flowchart 1400 begins at step 1402. In step 1402, the first encoded portion is decoded to provide a first decoded portion of a first decoded sequence of frames that represents the first perspective view. In the implementation example of FIG. 10, the processing circuitry 1004 decodes the first encoded portion. - At
step 1404, the second encoded portion is decoded to provide decoded data that represents the second perspective view, the decoded data including an interpolation marker. In the implementation example of FIG. 10, the processing circuitry 1004 decodes the second encoded portion. - At
step 1406, an interpolation is performed between the first decoded portion and a third decoded portion of a third decoded sequence of frames that represents a third perspective view to generate frame data to replace the interpolation marker in the decoded data. In the implementation example of FIG. 10, the processing circuitry 1004 interpolates between the first decoded portion and the third decoded portion to generate the frame data. - Instead of performing
step 1406 of flowchart 1400, the steps shown in flowchart 1500 of FIG. 15 may be performed. As shown in FIG. 15, flowchart 1500 begins at step 1502. In step 1502, a weight indicator is received from an upstream device. The weight indicator specifies an extent to which the first decoded portion is to be weighted with respect to the third decoded portion. In the implementation example of FIG. 10, the processing circuitry 1004 receives the weight indicator from the upstream device through input circuitry 1002. - At
step 1504, an interpolation is performed between the first decoded portion and a third decoded portion of a third decoded sequence of frames that represents a third perspective view to generate frame data to replace the interpolation marker in the decoded data, based on the extent that is specified by the weight indicator. In the implementation example of FIG. 10, the processing circuitry 1004 interpolates between the first decoded portion and the third decoded portion to generate the frame data. - As shown in
FIG. 16, flowchart 1600 begins at step 1602. In step 1602, at least a portion of first encoded data is retrieved that relates to a first sequence of frames representing a first perspective view. In the implementation example of FIG. 10, the processing circuitry 1004 retrieves the at least one portion of the first encoded data. - At
step 1604, at least a portion of second encoded data is retrieved that relates to a second sequence of frames representing a second perspective view. In the implementation example of FIG. 10, the processing circuitry 1004 retrieves the at least one portion of the second encoded data. - At
step 1606, a first frame is identified within the first sequence of frames that is not directly represented by the first encoded data retrieved. For example, an interpolation marker that is associated with the first frame may be identified. In accordance with this example, the interpolation marker may be accompanied by interpolation instructions. In another example, the first frame includes a missing frame. In the implementation example of FIG. 10, the processing circuitry 1004 identifies the first frame. - At
step 1608, an interpolation of the first frame is produced. For example, the interpolation may be based at least in part on the second encoded data. In another example, production of the interpolation of the first frame may be based on at least the portion of the first encoded data and at least a portion of third encoded data that relates to a third sequence of frames representing a third perspective view, based on a weight indicator. In accordance with this example, the weight indicator specifies an extent to which at least the portion of the first encoded data is to be weighted with respect to at least the portion of the third encoded data. In the implementation example of FIG. 10, the processing circuitry 1004 produces the interpolation of the first frame. -
FIGS. 17-20 illustrate exemplary interpolation techniques 1700, 1800, 1900, and 2000 in accordance with embodiments. Techniques 1700, 1800, 1900, and 2000 are described as follows. - Referring to
FIG. 17, technique 1700 is directed to staging 3D8 content from 2D up according to an embodiment. Technique 1700 will be described with reference to original 3D8 content 1702, 2D content 1704, 3D2 content 1706, 3D4 content 1708, and 3D8 content 1710. The original content used to illustrate technique 1700 is 3D8 content, which includes eight video streams (labeled as 1-8) that represent respective views of a video event. - A single stream of the original 3D8 content 1702 may be used to provide 2D content. As shown in
FIG. 17, stream 3 of the original 3D8 content 1702 is used to provide 2D content 1704 for illustrative purposes. Internal interframe compression referencing is used with respect to stream 3 of the original 3D8 content 1702 to generate stream 3 of the 2D content 1704. However, no other streams that are included in the original 3D8 content 1702 are referenced to generate stream 3 of the 2D content 1704. - Internal interframe compression referencing is a technique in which differences between frames (e.g., adjacent frames) that are included in a stream of video content are used to represent the frames in that stream. For example, a first frame in the stream may be designated as a reference frame with other frames in the stream being designated as dependent frames. In accordance with this example, the reference frame may be represented by data that is sufficient to independently define the reference frame, while the dependent frames may be represented by difference data. The difference data that represents each dependent frame may use data that represents one or more of the other frames in the stream to generate data that is sufficient to independently define that dependent frame.
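The internal interframe compression referencing technique referred to above can be sketched as follows; storing raw per-pixel differences (rather than motion-compensated, entropy-coded residuals) is a simplification for illustration.

```python
import numpy as np

def encode_stream(frames):
    """Internal interframe referencing in sketch form: the first frame is
    kept whole as the reference; each later frame is stored as its
    difference from the previous frame (int16 to allow negative values)."""
    reference = frames[0]
    diffs = [frames[i].astype(np.int16) - frames[i - 1].astype(np.int16)
             for i in range(1, len(frames))]
    return reference, diffs

def decode_stream(reference, diffs):
    # Rebuild each dependent frame by accumulating the differences.
    frames = [reference]
    for d in diffs:
        frames.append((frames[-1].astype(np.int16) + d).astype(np.uint8))
    return frames
```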
- Two streams of the original 3D8 content 1702 may be used to provide 3D2 content. As shown in
FIG. 17, streams 3 and 7 of the original 3D8 content 1702 are used to provide 3D2 content 1706 for illustrative purposes. Internal interframe compression referencing is used with respect to stream 3 of the original 3D8 content 1702 to generate stream 3 of the 3D2 content 1706, as described above with reference to 2D content 1704. Stream 7 of the 3D2 content 1706 is generated using internal interframe compression referencing with respect to stream 7 of the original 3D8 content 1702 and stream 3 of the 2D content 1704 for referencing. - Four streams of the original 3D8 content 1702 may be used to provide 3D4 content. As shown in
FIG. 17, streams 3 and 7 of the 3D4 content 1708 are generated as described above with reference to 3D2 content 1706. Streams 1 and 5 of the 3D4 content 1708 are generated using streams 1 and 5 of the original 3D8 content 1702 and streams 3 and 7 of the 3D2 content 1706 for referencing. - All eight streams of the original 3D8 content 1702 may be used to provide 3D8 content. As shown in
FIG. 17, streams 1, 3, 5, and 7 of the 3D8 content 1710 are generated as described above with reference to 3D4 content 1708. Streams 2, 4, 6, and 8 of the 3D8 content 1710 are generated using any of a plurality of streams, which includes streams 1, 3, 5, and 7 of the 3D4 content 1708 and streams of the original 3D8 content 1702, for referencing. - Referring to
FIG. 18, technique 1800 is directed to a limited referencing configuration according to an embodiment. Technique 1800 will be described with reference to original 3D8 content 1802, 2D content 1804, 3D2 content 1806, 3D4 content 1808, and 3D8 content 1810. The original content used to illustrate technique 1800 is 3D8 content, which includes eight video streams (labeled as 1-8) that represent respective views of a video event. The 2D content 1804 and the 3D2 content 1806 are generated in the same manner as the 2D content 1704 and the 3D2 content 1706 described above with reference to FIG. 17. However, the manner in which the 3D4 content 1808 and the 3D8 content 1810 are generated differs from the manner in which the 3D4 content 1708 and the 3D8 content 1710 are generated. - As shown in
FIG. 18, streams 3 and 7 of the 3D4 content 1808 are generated as described above with reference to 3D4 content 1708 of FIG. 17. However, stream 1 of the 3D4 content 1808 is generated using stream 1 of the original 3D8 content 1802 and streams 3 and 7 of the 3D2 content 1806 for referencing. Furthermore, stream 5 of the 3D4 content 1808 is generated using stream 5 of the original 3D8 content 1802 and streams 3 and 7 of the 3D2 content 1806 for referencing. -
Streams 1, 3, 5, and 7 of the 3D8 content 1810 are generated as described above with reference to the 3D4 content 1808. Streams 2, 4, 6, and 8 of the 3D8 content 1810 are generated using internal interframe compression referencing and streams 1, 3, 5, and 7 of the 3D4 content 1808 for referencing. - Referring to
FIG. 19, technique 1900 is directed to interpolation of lost frame data to maintain image stability according to an embodiment. Technique 1900 will be described with reference to original 3D8 content 1902, 2D content 1904, and 3D2 content 1906. The original content used to illustrate technique 1900 is 3D8 content, which includes eight video streams (labeled as 1-8) that represent respective views of a video event. The 2D content 1904 is generated in the same manner as the 2D content 1704 described above with reference to FIG. 17. However, the manner in which the 3D2 content 1906 is generated differs from the manner in which the 3D2 content 1706 is generated. - As shown in
FIG. 19, stream 3 of the 3D2 content 1906 is generated as described above with reference to 3D2 content 1706 of FIG. 17. Moreover, stream 7 of the 3D2 content 1906 is generated using stream 3 of the 2D content 1904 for referencing. Stream 7 of the 3D2 content 1906 is generated further using internal interframe compression referencing if a previous frame and/or a future frame of stream 3 of the 2D content 1904 is similar to a current frame of stream 3 of the 2D content 1904. - Referring to
FIG. 20, technique 2000 is directed to interpolation to provide a number of views that is greater than a number of views that are represented by received video content, according to an embodiment. Technique 2000 will be described with reference to original 3D4 content 2002, 2D content 2004, 3D2 content 2006, 3D4 content 2008, and 3D8 content 2010. The original content used to illustrate technique 2000 is 3D4 content, which includes four video streams (labeled as 1-4) that represent respective views of a video event. - A single stream of the
original 3D4 content 2002 may be used to provide 2D content. As shown in FIG. 20, stream 3 of the original 3D4 content 2002 is used to provide 2D content 2004 for illustrative purposes. Internal interframe compression referencing is used with respect to stream 3 of the original 3D4 content 2002 to generate stream 3 of the 2D content 2004. However, no other streams that are included in the original 3D4 content 2002 are referenced to generate stream 3 of the 2D content 2004. - Two streams of the
original 3D4 content 2002 may be used to provide 3D2 content. As shown in FIG. 20, streams 1 and 3 of the original 3D4 content 2002 are used to provide 3D2 content 2006 for illustrative purposes. Stream 3 of the 3D2 content 2006 is generated as described above with reference to the 2D content 2004. Stream 1 of the 3D2 content 2006 is generated using internal interframe compression referencing with respect to stream 1 of the original 3D4 content 2002 and stream 3 of the 2D content 2004 for referencing. - All four streams of the
original 3D4 content 2002 may be used to provide 3D4 content. As shown in FIG. 20, streams 1 and 3 of the 3D4 content 2008 are generated as described above with reference to 3D2 content 2006. Streams 2 and 4 of the 3D4 content 2008 are generated using streams 2 and 4 of the original 3D4 content 2002 and streams 1 and 3 of the 3D2 content 2006 for referencing. - All four streams of the
original 3D4 content 2002 may be used to provide 3D8 content. As shown in FIG. 20, streams 1-4 of the 3D8 content 2010 are generated as described above with reference to 3D4 content 2008. Streams 5-8 of the 3D8 content 2010 are entirely interpolated using nearest neighbor streams. - Streams that are used by a decoder for purposes of interpolation must be available to the decoder. For example, when external interpolation referencing is used to encode data, only frame sequences (perspective views) that are allowed to be referenced for encoding purposes may be used for interpolation purposes. External interpolation referencing involves referencing frames to be used for interpolation that can be found in frame sequences outside of a current frame sequence (i.e., from a different perspective view).
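In sketch form, this availability constraint reduces to a membership test like the following; the argument names are illustrative.

```python
def usable_for_interpolation(stream_id: int,
                             received_streams: set,
                             allowed_references: set) -> bool:
    """A stream can back an interpolation only if the decoder actually
    received it and the encoding rules allow referencing it."""
    return stream_id in received_streams and stream_id in allowed_references
```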
- In some embodiments, streams are encoded using hierarchical encoding techniques, as described in commonly-owned, co-pending U.S. patent application Ser. No. ______ (Atty. Docket No. A05.01330000), filed on even date herewith and entitled “Hierarchical Video Compression Supporting Selective Delivery of Two-Dimensional and Three-Dimensional Video Content,” the entirety of which is incorporated by reference herein. Such embodiments enable a subset of a total number of streams to be received and decoded at the decoder, wherein some of the streams received may be decoded by referencing some of the other streams received. In accordance with these embodiments, none of the received streams rely on non-received streams for decoding. Accordingly, in these embodiments, interpolation referencing is limited to received streams.
- III. Exemplary Electronic Device Implementations
- Embodiments may be implemented in hardware, software, firmware, or any combination thereof. For example,
encoding system 208, decoding system 214, display circuitry 216, input circuitry 302, processing circuitry 304, output circuitry 306, input circuitry 1002, processing circuitry 1004, and/or output circuitry 1006 may be implemented as hardware logic/electrical circuitry. In another example, encoding system 208, decoding system 214, display circuitry 216, input circuitry 302, processing circuitry 304, output circuitry 306, input circuitry 1002, processing circuitry 1004, and/or output circuitry 1006 may be implemented as computer program code configured to be executed in one or more processors. - For instance,
FIG. 21 shows a block diagram of an exemplary implementation of electronic device 2100 according to an embodiment. In embodiments, electronic device 2100 may include one or more of the elements shown in FIG. 21. As shown in the example of FIG. 21, electronic device 2100 may include one or more processors (also called central processing units, or CPUs), such as a processor 2104. Processor 2104 is connected to a communication infrastructure 2102, such as a communication bus. In some embodiments, processor 2104 can simultaneously operate multiple computing threads. -
Electronic device 2100 also includes a primary or main memory 2106, such as random access memory (RAM). Main memory 2106 has stored therein control logic 2128A (computer software), and data. -
Electronic device 2100 also includes one or more secondary storage devices 2110. Secondary storage devices 2110 include, for example, a hard disk drive 2112 and/or a removable storage device or drive 2114, as well as other types of storage devices, such as memory cards and memory sticks. For instance, electronic device 2100 may include an industry standard interface, such as a universal serial bus (USB) interface, for interfacing with devices such as a memory stick. Removable storage drive 2114 represents a floppy disk drive, a magnetic tape drive, a compact disk drive, an optical storage device, tape backup, etc. -
Removable storage drive 2114 interacts with a removable storage unit 2116. Removable storage unit 2116 includes a computer useable or readable storage medium 2124 having stored therein computer software 2128B (control logic) and/or data. Removable storage unit 2116 represents a floppy disk, magnetic tape, compact disk, DVD, optical storage disk, or any other computer data storage device. Removable storage drive 2114 reads from and/or writes to removable storage unit 2116 in a well known manner. -
Electronic device 2100 further includes a communication or network interface 2118. Communication interface 2118 enables the electronic device 2100 to communicate with remote devices. For example, communication interface 2118 allows electronic device 2100 to communicate over communication networks or mediums 2142 (representing a form of a computer useable or readable medium), such as LANs, WANs, the Internet, etc. Network interface 2118 may interface with remote sites or networks via wired or wireless connections. -
Control logic 2128C may be transmitted to and from electronic device 2100 via the communication medium 2122. - Any apparatus or manufacture comprising a computer useable or readable medium having control logic (software) stored therein is referred to herein as a computer program product or program storage device. This includes, but is not limited to,
electronic device 2100, main memory 2106, secondary storage devices 2110, and removable storage unit 2116. Such computer program products, having control logic stored therein that, when executed by one or more data processing devices, cause such data processing devices to operate as described herein, represent embodiments of the invention. - Devices in which embodiments may be implemented may include storage, such as storage drives, memory devices, and further types of computer-readable media. Examples of such computer-readable storage media include a hard disk, a removable magnetic disk, a removable optical disk, flash memory cards, digital video disks, random access memories (RAMs), read only memories (ROM), and the like. As used herein, the terms "computer program medium" and "computer-readable medium" are used to generally refer to the hard disk associated with a hard disk drive, a removable magnetic disk, a removable optical disk (e.g., CDROMs, DVDs, etc.), zip disks, tapes, magnetic storage devices, MEMS (micro-electromechanical systems) storage, nanotechnology-based storage devices, as well as other media such as flash memory cards, digital video discs, RAM devices, ROM devices, and the like. Such computer-readable storage media may store program modules that include computer program logic for
encoding system 208, decoding system 214, display circuitry 216, input circuitry 302, processing circuitry 304, output circuitry 306, input circuitry 1002, processing circuitry 1004, output circuitry 1006, flowchart 400, flowchart 500, flowchart 600, flowchart 700, flowchart 800, flowchart 900, flowchart 1100, flowchart 1200, flowchart 1300, flowchart 1400, flowchart 1500, and/or flowchart 1600 (including any one or more steps of flowcharts 400-900 and 1100-1600). - The invention can be put into practice using software, firmware, and/or hardware implementations other than those described herein. Any software, firmware, and hardware implementations suitable for performing the functions described herein can be used.
- As described herein,
electronic device 2100 may be implemented in association with a variety of types of display devices. For instance, electronic device 2100 may be one of a variety of types of media devices, such as a stand-alone display (e.g., a television display such as a flat panel display, etc.), a computer, a game console, a set top box, a digital video recorder (DVR), or another electronic device mentioned elsewhere herein, etc. Media content that is delivered in two-dimensional or three-dimensional form according to embodiments described herein may be stored locally or received from remote locations. For instance, such media content may be locally stored for playback (replay TV, DVR), may be stored in removable memory (e.g., DVDs, memory sticks, etc.), or may be received on wireless and/or wired pathways through a network such as a home network, through Internet download streaming, through a cable network, a satellite network, and/or a fiber network, etc. For instance, FIG. 21 shows a first media content 2130A that is stored in hard disk drive 2112, a second media content 2130B that is stored in storage medium 2124 of removable storage unit 2116, and a third media content 2130C that may be remotely stored and received over communication medium 2122 by communication interface 2118. Media content 2130 may be stored and/or received in these manners and/or in other ways. - IV. Conclusion
- While various embodiments have been described above, it should be understood that they have been presented by way of example only, and not limitation. It will be understood by those skilled in the relevant art(s) that various changes in form and details may be made to the embodiments described herein without departing from the spirit and scope of the invention. Accordingly, the breadth and scope of the present invention should not be limited by any of the above-described exemplary embodiments, but should be defined only in accordance with the following claims and their equivalents.
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/982,248 US20110157315A1 (en) | 2009-12-31 | 2010-12-30 | Interpolation of three-dimensional video content |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US29181809P | 2009-12-31 | 2009-12-31 | |
US30311910P | 2010-02-10 | 2010-02-10 | |
US12/982,248 US20110157315A1 (en) | 2009-12-31 | 2010-12-30 | Interpolation of three-dimensional video content |
Publications (1)
Publication Number | Publication Date |
---|---|
US20110157315A1 true US20110157315A1 (en) | 2011-06-30 |
Family
ID=43797724
Family Applications (27)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/774,225 Abandoned US20110157322A1 (en) | 2009-12-31 | 2010-05-05 | Controlling a pixel array to support an adaptable light manipulator |
US12/774,307 Active 2032-01-14 US8964013B2 (en) | 2009-12-31 | 2010-05-05 | Display with elastic light manipulator |
US12/845,440 Abandoned US20110157697A1 (en) | 2009-12-31 | 2010-07-28 | Adaptable parallax barrier supporting mixed 2d and stereoscopic 3d display regions |
US12/845,409 Abandoned US20110157696A1 (en) | 2009-12-31 | 2010-07-28 | Display with adaptable parallax barrier |
US12/845,461 Active 2031-10-30 US8767050B2 (en) | 2009-12-31 | 2010-07-28 | Display supporting multiple simultaneous 3D views |
US12/982,309 Active 2033-05-02 US9204138B2 (en) | 2009-12-31 | 2010-12-30 | User controlled regional display of mixed two and three dimensional content |
US12/982,088 Active 2032-01-06 US9066092B2 (en) | 2009-12-31 | 2010-12-30 | Communication infrastructure including simultaneous video pathways for multi-viewer support |
US12/982,053 Abandoned US20110157309A1 (en) | 2009-12-31 | 2010-12-30 | Hierarchical video compression supporting selective delivery of two-dimensional and three-dimensional video content |
US12/982,248 Abandoned US20110157315A1 (en) | 2009-12-31 | 2010-12-30 | Interpolation of three-dimensional video content |
US12/982,140 Abandoned US20110161843A1 (en) | 2009-12-31 | 2010-12-30 | Internet browser and associated content definition supporting mixed two and three dimensional displays |
US12/982,199 Active 2032-09-27 US8988506B2 (en) | 2009-12-31 | 2010-12-30 | Transcoder supporting selective delivery of 2D, stereoscopic 3D, and multi-view 3D content from source video |
US12/982,124 Active 2033-02-08 US9124885B2 (en) | 2009-12-31 | 2010-12-30 | Operating system supporting mixed 2D, stereoscopic 3D and multi-view 3D displays |
US12/982,047 Abandoned US20110157330A1 (en) | 2009-12-31 | 2010-12-30 | 2d/3d projection system |
US12/982,377 Abandoned US20110157327A1 (en) | 2009-12-31 | 2010-12-30 | 3d audio delivery accompanying 3d display supported by viewer/listener position and orientation tracking |
US12/982,330 Abandoned US20110157326A1 (en) | 2009-12-31 | 2010-12-30 | Multi-path and multi-source 3d content storage, retrieval, and delivery |
US12/982,020 Abandoned US20110157257A1 (en) | 2009-12-31 | 2010-12-30 | Backlighting array supporting adaptable parallax barrier |
US12/982,069 Active 2033-05-07 US8922545B2 (en) | 2009-12-31 | 2010-12-30 | Three-dimensional display system with adaptation based on viewing reference of viewer(s) |
US12/982,156 Active 2035-11-09 US9654767B2 (en) | 2009-12-31 | 2010-12-30 | Programming architecture supporting mixed two and three dimensional displays |
US12/982,212 Active 2032-04-05 US9013546B2 (en) | 2009-12-31 | 2010-12-30 | Adaptable media stream servicing two and three dimensional content |
US12/982,273 Active 2032-08-13 US9979954B2 (en) | 2009-12-31 | 2010-12-30 | Eyewear with time shared viewing supporting delivery of differing content to multiple viewers |
US12/982,173 Active 2033-08-22 US9143770B2 (en) | 2009-12-31 | 2010-12-30 | Application programming interface supporting mixed two and three dimensional displays |
US12/982,062 Active 2032-06-13 US8687042B2 (en) | 2009-12-31 | 2010-12-30 | Set-top box circuitry supporting 2D and 3D content reductions to accommodate viewing environment constraints |
US12/982,031 Active 2032-12-14 US9019263B2 (en) | 2009-12-31 | 2010-12-30 | Coordinated driving of adaptable light manipulator, backlighting and pixel array in support of adaptable 2D and 3D displays |
US12/982,362 Active 2031-02-05 US9049440B2 (en) | 2009-12-31 | 2010-12-30 | Independent viewer tailoring of same media source content via a common 2D-3D display |
US14/504,095 Abandoned US20150015668A1 (en) | 2009-12-31 | 2014-10-01 | Three-dimensional display system with adaptation based on viewing reference of viewer(s) |
US14/616,130 Abandoned US20150156473A1 (en) | 2009-12-31 | 2015-02-06 | Transcoder supporting selective delivery of 2d, stereoscopic 3d, and multi-view 3d content from source video |
US14/723,922 Abandoned US20150264341A1 (en) | 2009-12-31 | 2015-05-28 | Communication infrastructure including simultaneous video pathways for multi-viewer support |
Family Applications Before (8)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/774,225 Abandoned US20110157322A1 (en) | 2009-12-31 | 2010-05-05 | Controlling a pixel array to support an adaptable light manipulator |
US12/774,307 Active 2032-01-14 US8964013B2 (en) | 2009-12-31 | 2010-05-05 | Display with elastic light manipulator |
US12/845,440 Abandoned US20110157697A1 (en) | 2009-12-31 | 2010-07-28 | Adaptable parallax barrier supporting mixed 2d and stereoscopic 3d display regions |
US12/845,409 Abandoned US20110157696A1 (en) | 2009-12-31 | 2010-07-28 | Display with adaptable parallax barrier |
US12/845,461 Active 2031-10-30 US8767050B2 (en) | 2009-12-31 | 2010-07-28 | Display supporting multiple simultaneous 3D views |
US12/982,309 Active 2033-05-02 US9204138B2 (en) | 2009-12-31 | 2010-12-30 | User controlled regional display of mixed two and three dimensional content |
US12/982,088 Active 2032-01-06 US9066092B2 (en) | 2009-12-31 | 2010-12-30 | Communication infrastructure including simultaneous video pathways for multi-viewer support |
US12/982,053 Abandoned US20110157309A1 (en) | 2009-12-31 | 2010-12-30 | Hierarchical video compression supporting selective delivery of two-dimensional and three-dimensional video content |
Family Applications After (18)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/982,140 Abandoned US20110161843A1 (en) | 2009-12-31 | 2010-12-30 | Internet browser and associated content definition supporting mixed two and three dimensional displays |
US12/982,199 Active 2032-09-27 US8988506B2 (en) | 2009-12-31 | 2010-12-30 | Transcoder supporting selective delivery of 2D, stereoscopic 3D, and multi-view 3D content from source video |
US12/982,124 Active 2033-02-08 US9124885B2 (en) | 2009-12-31 | 2010-12-30 | Operating system supporting mixed 2D, stereoscopic 3D and multi-view 3D displays |
US12/982,047 Abandoned US20110157330A1 (en) | 2009-12-31 | 2010-12-30 | 2d/3d projection system |
US12/982,377 Abandoned US20110157327A1 (en) | 2009-12-31 | 2010-12-30 | 3d audio delivery accompanying 3d display supported by viewer/listener position and orientation tracking |
US12/982,330 Abandoned US20110157326A1 (en) | 2009-12-31 | 2010-12-30 | Multi-path and multi-source 3d content storage, retrieval, and delivery |
US12/982,020 Abandoned US20110157257A1 (en) | 2009-12-31 | 2010-12-30 | Backlighting array supporting adaptable parallax barrier |
US12/982,069 Active 2033-05-07 US8922545B2 (en) | 2009-12-31 | 2010-12-30 | Three-dimensional display system with adaptation based on viewing reference of viewer(s) |
US12/982,156 Active 2035-11-09 US9654767B2 (en) | 2009-12-31 | 2010-12-30 | Programming architecture supporting mixed two and three dimensional displays |
US12/982,212 Active 2032-04-05 US9013546B2 (en) | 2009-12-31 | 2010-12-30 | Adaptable media stream servicing two and three dimensional content |
US12/982,273 Active 2032-08-13 US9979954B2 (en) | 2009-12-31 | 2010-12-30 | Eyewear with time shared viewing supporting delivery of differing content to multiple viewers |
US12/982,173 Active 2033-08-22 US9143770B2 (en) | 2009-12-31 | 2010-12-30 | Application programming interface supporting mixed two and three dimensional displays |
US12/982,062 Active 2032-06-13 US8687042B2 (en) | 2009-12-31 | 2010-12-30 | Set-top box circuitry supporting 2D and 3D content reductions to accommodate viewing environment constraints |
US12/982,031 Active 2032-12-14 US9019263B2 (en) | 2009-12-31 | 2010-12-30 | Coordinated driving of adaptable light manipulator, backlighting and pixel array in support of adaptable 2D and 3D displays |
US12/982,362 Active 2031-02-05 US9049440B2 (en) | 2009-12-31 | 2010-12-30 | Independent viewer tailoring of same media source content via a common 2D-3D display |
US14/504,095 Abandoned US20150015668A1 (en) | 2009-12-31 | 2014-10-01 | Three-dimensional display system with adaptation based on viewing reference of viewer(s) |
US14/616,130 Abandoned US20150156473A1 (en) | 2009-12-31 | 2015-02-06 | Transcoder supporting selective delivery of 2d, stereoscopic 3d, and multi-view 3d content from source video |
US14/723,922 Abandoned US20150264341A1 (en) | 2009-12-31 | 2015-05-28 | Communication infrastructure including simultaneous video pathways for multi-viewer support |
Country Status (5)
Country | Link |
---|---|
US (27) | US20110157322A1 (en) |
EP (4) | EP2357508A1 (en) |
CN (3) | CN102183840A (en) |
HK (1) | HK1161754A1 (en) |
TW (3) | TW201142356A (en) |
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110164188A1 (en) * | 2009-12-31 | 2011-07-07 | Broadcom Corporation | Remote control with integrated position, viewer identification and optical and audio test |
US20110164115A1 (en) * | 2009-12-31 | 2011-07-07 | Broadcom Corporation | Transcoder supporting selective delivery of 2d, stereoscopic 3d, and multi-view 3d content from source video |
US20120194676A1 (en) * | 2009-10-07 | 2012-08-02 | Robert Laganiere | Video analytics method and system |
US20130033576A1 (en) * | 2011-08-03 | 2013-02-07 | Myokan Yoshihiro | Image processing device and method, and program |
US20130050572A1 (en) * | 2011-08-24 | 2013-02-28 | Ati Technologies Ulc | Method and apparatus for providing dropped picture image processing |
US20140044412A1 (en) * | 2012-08-08 | 2014-02-13 | Samsung Electronics Co., Ltd. | Terminal and method for generating live image |
US8854531B2 (en) | 2009-12-31 | 2014-10-07 | Broadcom Corporation | Multiple remote controllers that each simultaneously controls a different visual presentation of a 2D/3D display |
US20150085076A1 (en) * | 2013-09-24 | 2015-03-26 | Amazon Technologies, Inc. | Approaches for simulating three-dimensional views
US9247286B2 (en) | 2009-12-31 | 2016-01-26 | Broadcom Corporation | Frame formatting supporting mixed two and three dimensional video data communication |
US20160255322A1 (en) * | 2013-10-07 | 2016-09-01 | Vid Scale, Inc. | User adaptive 3d video rendering and delivery |
US9667919B2 (en) | 2012-08-02 | 2017-05-30 | Iwatchlife Inc. | Method and system for anonymous video analytics processing |
US10375375B2 (en) * | 2017-05-15 | 2019-08-06 | Lg Electronics Inc. | Method of providing fixed region information or offset region information for subtitle in virtual reality system and device for controlling the same |
US10757324B2 (en) | 2018-08-03 | 2020-08-25 | Semiconductor Components Industries, Llc | Transform processors for gradually switching between image transforms |
US10802324B2 (en) | 2017-03-14 | 2020-10-13 | Boe Technology Group Co., Ltd. | Double vision display method and device |
Families Citing this family (501)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8416217B1 (en) * | 2002-11-04 | 2013-04-09 | Neonode Inc. | Light-based finger gesture user interface |
US9015736B2 (en) * | 2005-12-29 | 2015-04-21 | Rovi Guides, Inc. | Systems and methods for episode tracking in an interactive media environment |
US8121361B2 (en) | 2006-05-19 | 2012-02-21 | The Queen's Medical Center | Motion tracking system for real time adaptive imaging and spectroscopy |
FR2906899B1 (en) * | 2006-10-05 | 2009-01-16 | Essilor Int | Display device for stereoscopic visualization
JP2008106185A (en) * | 2006-10-27 | 2008-05-08 | Shin Etsu Chem Co Ltd | Method for adhering thermally conductive silicone composition, primer for adhesion of thermally conductive silicone composition and method for production of adhesion composite of thermally conductive silicone composition |
US8570423B2 (en) * | 2009-01-28 | 2013-10-29 | Hewlett-Packard Development Company, L.P. | Systems for performing visual collaboration between remotely situated participants |
US8775023B2 (en) | 2009-02-15 | 2014-07-08 | Neonode Inc. | Light-based touch controls on a steering wheel and dashboard
EP2256620A1 (en) * | 2009-05-29 | 2010-12-01 | Koninklijke Philips Electronics N.V. | Picture selection method for modular lighting system |
US8125418B2 (en) * | 2009-06-26 | 2012-02-28 | Global Oled Technology Llc | Passive-matrix chiplet drivers for displays |
US9407908B2 (en) * | 2009-08-20 | 2016-08-02 | Lg Electronics Inc. | Image display apparatus and method for operating the same |
JP5187639B2 (en) * | 2009-08-28 | 2013-04-24 | 独立行政法人情報通信研究機構 | 3D display |
US20110080472A1 (en) * | 2009-10-02 | 2011-04-07 | Eric Gagneraud | Autostereoscopic status display |
CN102474632A (en) * | 2009-12-08 | 2012-05-23 | 美国博通公司 | Method and system for handling multiple 3-d video formats |
US20110143769A1 (en) * | 2009-12-16 | 2011-06-16 | Microsoft Corporation | Dual display mobile communication device |
CA2889724C (en) | 2009-12-21 | 2021-06-08 | Kik Interactive Inc. | Systems and methods for accessing and controlling media stored remotely |
US8684531B2 (en) * | 2009-12-28 | 2014-04-01 | Vision3D Technologies, Llc | Stereoscopic display device projecting parallax image and adjusting amount of parallax |
US20110187839A1 (en) * | 2010-02-01 | 2011-08-04 | VIZIO Inc. | Frame based three-dimensional encoding method |
US20110191328A1 (en) * | 2010-02-03 | 2011-08-04 | Vernon Todd H | System and method for extracting representative media content from an online document |
US20110202845A1 (en) * | 2010-02-17 | 2011-08-18 | Anthony Jon Mountjoy | System and method for generating and distributing three dimensional interactive content |
JP2011199853A (en) * | 2010-02-23 | 2011-10-06 | Panasonic Corp | Three-dimensional image reproducing apparatus |
DE102010009737A1 (en) * | 2010-03-01 | 2011-09-01 | Institut für Rundfunktechnik GmbH | Method and arrangement for reproducing 3D image content |
JP5462672B2 (en) * | 2010-03-16 | 2014-04-02 | 株式会社ジャパンディスプレイ | Display device and electronic device |
US8634873B2 (en) * | 2010-03-17 | 2014-01-21 | Microsoft Corporation | Mobile communication device having multiple, interchangeable second devices |
KR101289269B1 (en) * | 2010-03-23 | 2013-07-24 | 한국전자통신연구원 | An apparatus and method for displaying image data in image system |
KR20110109565A (en) * | 2010-03-31 | 2011-10-06 | 삼성전자주식회사 | Backlight unit, 3d display having the same and method of making 3d image |
US10448083B2 (en) * | 2010-04-06 | 2019-10-15 | Comcast Cable Communications, Llc | Streaming and rendering of 3-dimensional video |
KR20110115806A (en) * | 2010-04-16 | 2011-10-24 | 삼성전자주식회사 | Display apparatus and 3d glasses, and display system including the same |
WO2011132422A1 (en) * | 2010-04-21 | 2011-10-27 | パナソニック株式会社 | Three-dimensional video display device and three-dimensional video display method |
US8667533B2 (en) * | 2010-04-22 | 2014-03-04 | Microsoft Corporation | Customizing streaming content presentation |
US9271052B2 (en) * | 2010-05-10 | 2016-02-23 | Comcast Cable Communications, Llc | Grid encoded media asset data |
US9030536B2 (en) | 2010-06-04 | 2015-05-12 | AT&T Intellectual Property I, L.P. | Apparatus and method for presenting media content
JP5510097B2 (en) * | 2010-06-16 | 2014-06-04 | ソニー株式会社 | Signal transmission method, signal transmission device, and signal reception device |
US9225975B2 (en) | 2010-06-21 | 2015-12-29 | Microsoft Technology Licensing, Llc | Optimization of a multi-view display |
US10089937B2 (en) * | 2010-06-21 | 2018-10-02 | Microsoft Technology Licensing, Llc | Spatial and temporal multiplexing display |
KR20110139497A (en) * | 2010-06-23 | 2011-12-29 | 삼성전자주식회사 | Display apparatus and method for displaying thereof |
JP2012013980A (en) * | 2010-07-01 | 2012-01-19 | Sony Corp | Stereoscopic display device and display drive circuit |
US9049426B2 (en) * | 2010-07-07 | 2015-06-02 | AT&T Intellectual Property I, L.P. | Apparatus and method for distributing three dimensional media content
US8670070B2 (en) * | 2010-07-15 | 2014-03-11 | Broadcom Corporation | Method and system for achieving better picture quality in various zoom modes |
US9032470B2 (en) | 2010-07-20 | 2015-05-12 | AT&T Intellectual Property I, L.P. | Apparatus for adapting a presentation of media content according to a position of a viewing apparatus
US9232274B2 (en) | 2010-07-20 | 2016-01-05 | At&T Intellectual Property I, L.P. | Apparatus for adapting a presentation of media content to a requesting device |
JP2012034138A (en) * | 2010-07-29 | 2012-02-16 | Toshiba Corp | Signal processing apparatus and signal processing method |
KR20120020627A (en) * | 2010-08-30 | 2012-03-08 | 삼성전자주식회사 | Apparatus and method for image processing using 3d image format |
TW201227684A (en) * | 2010-09-01 | 2012-07-01 | Seereal Technologies Sa | Backplane device |
US20120057007A1 (en) * | 2010-09-03 | 2012-03-08 | Satoshi Ishiguro | Simplified Visual Screening Check on Television |
JP5058316B2 (en) * | 2010-09-03 | 2012-10-24 | 株式会社東芝 | Electronic device, image processing method, and image processing program |
JP5364666B2 (en) * | 2010-09-13 | 2013-12-11 | 株式会社東芝 | Stereoscopic image display apparatus, method and program |
JP5368399B2 (en) * | 2010-09-17 | 2013-12-18 | 富士フイルム株式会社 | Electronic album generating apparatus, stereoscopic image pasting apparatus, operation control method thereof, and program thereof |
EP2432218B1 (en) * | 2010-09-20 | 2016-04-20 | EchoStar Technologies L.L.C. | Methods of displaying an electronic program guide |
US9309556B2 (en) | 2010-09-24 | 2016-04-12 | The Board Of Trustees Of The Leland Stanford Junior University | Direct capture, amplification and sequencing of target DNA using immobilized primers |
EP2629135B1 (en) * | 2010-10-13 | 2015-03-18 | Sharp Kabushiki Kaisha | Display device |
KR20120046937A (en) * | 2010-11-03 | 2012-05-11 | 삼성전자주식회사 | Method and apparatus for providing 3d effect in video device |
US8922658B2 (en) * | 2010-11-05 | 2014-12-30 | Tom Galvin | Network video recorder system |
US9860490B2 (en) | 2010-11-05 | 2018-01-02 | Tom Galvin | Network video recorder system |
US10157526B2 (en) | 2010-11-05 | 2018-12-18 | Razberi Technologies, Inc. | System and method for a security system |
KR101670927B1 (en) * | 2010-11-05 | 2016-11-01 | 삼성전자주식회사 | Display apparatus and method |
US11082665B2 (en) | 2010-11-05 | 2021-08-03 | Razberi Secure Technologies, Llc | System and method for a security system |
US10477158B2 (en) | 2010-11-05 | 2019-11-12 | Razberi Technologies, Inc. | System and method for a security system |
US9218115B2 (en) | 2010-12-02 | 2015-12-22 | Lg Electronics Inc. | Input device and image display apparatus including the same |
US9172943B2 (en) * | 2010-12-07 | 2015-10-27 | AT&T Intellectual Property I, L.P. | Dynamic modification of video content at a set-top box device
KR20120065774A (en) * | 2010-12-13 | 2012-06-21 | 삼성전자주식회사 | Audio providing apparatus, audio receiver and method for providing audio |
KR101734285B1 (en) * | 2010-12-14 | 2017-05-11 | 엘지전자 주식회사 | Video processing apparatus of mobile terminal and method thereof |
US8963694B2 (en) * | 2010-12-17 | 2015-02-24 | Sony Corporation | System and method for remote controlled device selection based on device position data and orientation data of a user |
US20120154559A1 (en) * | 2010-12-21 | 2012-06-21 | Voss Shane D | Generate Media |
US9386294B2 (en) * | 2011-01-05 | 2016-07-05 | Google Technology Holdings LLC | Method and apparatus for 3DTV image adjustment |
US8983555B2 (en) * | 2011-01-07 | 2015-03-17 | Microsoft Technology Licensing, Llc | Wireless communication techniques |
US8643684B2 (en) * | 2011-01-18 | 2014-02-04 | Disney Enterprises, Inc. | Multi-layer plenoptic displays that combine multiple emissive and light modulating planes |
TW201232280A (en) * | 2011-01-20 | 2012-08-01 | Hon Hai Prec Ind Co Ltd | System and method for sharing desktop information |
KR20120088467A (en) * | 2011-01-31 | 2012-08-08 | 삼성전자주식회사 | Method and apparatus for displaying partial 3d image in 2d image display area
JP5632764B2 (en) * | 2011-02-02 | 2014-11-26 | セイコーインスツル株式会社 | Stereoscopic image display device |
US20120202187A1 (en) * | 2011-02-03 | 2012-08-09 | Shadowbox Comics, Llc | Method for distribution and display of sequential graphic art |
US10083639B2 (en) * | 2011-02-04 | 2018-09-25 | Seiko Epson Corporation | Control device for controlling image display device, head-mounted display device, image display system, control method for the image display device, and control method for the head-mounted display device |
US8724467B2 (en) | 2011-02-04 | 2014-05-13 | Cisco Technology, Inc. | System and method for managing congestion in a network environment |
TWI569041B (en) | 2011-02-14 | 2017-02-01 | 半導體能源研究所股份有限公司 | Display device |
US8630247B2 (en) * | 2011-02-15 | 2014-01-14 | Cisco Technology, Inc. | System and method for managing tracking area identity lists in a mobile network environment |
US9035860B2 (en) | 2011-02-16 | 2015-05-19 | Semiconductor Energy Laboratory Co., Ltd. | Display device |
WO2012111427A1 (en) | 2011-02-16 | 2012-08-23 | Semiconductor Energy Laboratory Co., Ltd. | Display device |
US9443455B2 (en) | 2011-02-25 | 2016-09-13 | Semiconductor Energy Laboratory Co., Ltd. | Display device having a plurality of pixels |
KR101852428B1 (en) * | 2011-03-09 | 2018-04-26 | 엘지전자 주식회사 | Mobile terminal and 3d object control method thereof
US9558687B2 (en) | 2011-03-11 | 2017-01-31 | Semiconductor Energy Laboratory Co., Ltd. | Display device and method for driving the same |
US9578299B2 (en) * | 2011-03-14 | 2017-02-21 | Qualcomm Incorporated | Stereoscopic conversion for shader based graphics content |
JP5766479B2 (en) * | 2011-03-25 | 2015-08-19 | 京セラ株式会社 | Electronic device, control method, and control program |
JP5730091B2 (en) * | 2011-03-25 | 2015-06-03 | 株式会社ジャパンディスプレイ | Display panel, display device and electronic device |
JP5092033B2 (en) * | 2011-03-28 | 2012-12-05 | 株式会社東芝 | Electronic device, display control method, and display control program |
JP2012205285A (en) * | 2011-03-28 | 2012-10-22 | Sony Corp | Video signal processing apparatus and video signal processing method |
WO2012138539A2 (en) * | 2011-04-08 | 2012-10-11 | The Regents Of The University Of California | Interactive system for collecting, displaying, and ranking items based on quantitative and textual input from multiple participants |
US8988512B2 (en) * | 2011-04-14 | 2015-03-24 | Mediatek Inc. | Method for adjusting playback of multimedia content according to detection result of user status and related apparatus thereof |
JP5162000B2 (en) * | 2011-04-19 | 2013-03-13 | 株式会社東芝 | Information processing apparatus, information processing method, and program |
JP5161998B2 (en) * | 2011-04-19 | 2013-03-13 | 株式会社東芝 | Information processing apparatus, information processing method, and program |
JP5161999B2 (en) * | 2011-04-19 | 2013-03-13 | 株式会社東芝 | Electronic device, display control method, and display control program |
CN103444187A (en) * | 2011-05-05 | 2013-12-11 | 英派尔科技开发有限公司 | Lenticular directional display |
US20120287115A1 (en) * | 2011-05-10 | 2012-11-15 | Ding Junjie | Method for generating image frames |
KR20120126458A (en) * | 2011-05-11 | 2012-11-21 | 엘지전자 주식회사 | Method for processing broadcasting signal and display device thereof |
WO2012156778A1 (en) * | 2011-05-13 | 2012-11-22 | Sony Ericsson Mobile Communications Ab | Adjusting parallax barriers |
US8913104B2 (en) * | 2011-05-24 | 2014-12-16 | Bose Corporation | Audio synchronization for two dimensional and three dimensional video signals |
US9420259B2 (en) * | 2011-05-24 | 2016-08-16 | Comcast Cable Communications, Llc | Dynamic distribution of three-dimensional content |
JP6050941B2 (en) * | 2011-05-26 | 2016-12-21 | Saturn Licensing LLC | Display device and method, and program
US9442562B2 (en) * | 2011-05-27 | 2016-09-13 | Dolby Laboratories Licensing Corporation | Systems and methods of image processing that adjust for viewer position, screen size and viewing distance |
US9084068B2 (en) * | 2011-05-30 | 2015-07-14 | Sony Corporation | Sensor-based placement of sound in video recording |
CN103262551B (en) * | 2011-06-01 | 2015-12-09 | 松下电器产业株式会社 | Image processing device, transmission device, image processing system, image processing method, transmission method and integrated circuit
JP2012253543A (en) * | 2011-06-02 | 2012-12-20 | Seiko Epson Corp | Display device, control method of display device, and program |
JP5770018B2 (en) * | 2011-06-03 | 2015-08-26 | 任天堂株式会社 | Display control program, display control apparatus, display control method, and display control system |
US9420268B2 (en) | 2011-06-23 | 2016-08-16 | Lg Electronics Inc. | Apparatus and method for displaying 3-dimensional image |
WO2012174739A1 (en) * | 2011-06-24 | 2012-12-27 | Technicolor (China) Technology Co., Ltd. | Method and device for delivering 3d content |
US9030522B2 (en) | 2011-06-24 | 2015-05-12 | AT&T Intellectual Property I, L.P. | Apparatus and method for providing media content
US9445046B2 (en) | 2011-06-24 | 2016-09-13 | At&T Intellectual Property I, L.P. | Apparatus and method for presenting media content with telepresence |
US9602766B2 (en) | 2011-06-24 | 2017-03-21 | At&T Intellectual Property I, L.P. | Apparatus and method for presenting three dimensional objects with telepresence |
KR101772458B1 (en) * | 2011-06-28 | 2017-08-30 | 엘지전자 주식회사 | Display device and method for controlling thereof |
US20130265300A1 (en) * | 2011-07-03 | 2013-10-10 | Neorai Vardi | Computer device in form of wearable glasses and user interface thereof |
JP2013015779A (en) * | 2011-07-06 | 2013-01-24 | Sony Corp | Display control device, display control method, and computer program |
US8988411B2 (en) | 2011-07-08 | 2015-03-24 | Semiconductor Energy Laboratory Co., Ltd. | Display device |
US9137522B2 (en) * | 2011-07-11 | 2015-09-15 | Realtek Semiconductor Corp. | Device and method for 3-D display control |
US9294752B2 (en) * | 2011-07-13 | 2016-03-22 | Google Technology Holdings LLC | Dual mode user interface system and method for 3D video |
US8587635B2 (en) | 2011-07-15 | 2013-11-19 | At&T Intellectual Property I, L.P. | Apparatus and method for providing media services with telepresence |
US8928708B2 (en) | 2011-07-15 | 2015-01-06 | Semiconductor Energy Laboratory Co., Ltd. | Display device and method for driving the display device |
KR101926477B1 (en) * | 2011-07-18 | 2018-12-11 | 삼성전자 주식회사 | Contents play method and apparatus |
KR20130010834A (en) * | 2011-07-19 | 2013-01-29 | 가부시키가이샤 한도오따이 에네루기 켄큐쇼 | Display device |
JP2013038504A (en) | 2011-08-04 | 2013-02-21 | Sony Corp | Imaging device, image processing method and program |
JP5815326B2 (en) * | 2011-08-12 | 2015-11-17 | ルネサスエレクトロニクス株式会社 | Video decoding device and image display device |
WO2013026048A2 (en) * | 2011-08-18 | 2013-02-21 | Utherverse Digital, Inc. | Systems and methods of virtual world interaction |
EP2747641A4 (en) | 2011-08-26 | 2015-04-01 | Kineticor Inc | Methods, systems, and devices for intra-scan motion correction |
JP2013050537A (en) * | 2011-08-30 | 2013-03-14 | Sony Corp | Display device and electronic apparatus |
JP2013050539A (en) * | 2011-08-30 | 2013-03-14 | Sony Corp | Display device and electronic apparatus |
US20130050596A1 (en) * | 2011-08-30 | 2013-02-28 | Industrial Technology Research Institute | Auto-stereoscopic display and method for fabricating the same |
JP2013050538A (en) | 2011-08-30 | 2013-03-14 | Sony Corp | Display device and electronic apparatus |
KR102008818B1 (en) * | 2011-08-31 | 2019-08-08 | 엘지전자 주식회사 | Digital broadcast signal processing method and device |
US8872813B2 (en) | 2011-09-02 | 2014-10-28 | Adobe Systems Incorporated | Parallax image authoring and viewing in digital media |
CN102368244B (en) * | 2011-09-08 | 2013-05-15 | 广州市动景计算机科技有限公司 | Page content alignment method, device and mobile terminal browser |
DE112012003931T5 (en) | 2011-09-21 | 2014-07-10 | Magna Electronics, Inc. | Image processing system for a motor vehicle with image data transmission and power supply via a coaxial cable |
CN102510503B (en) * | 2011-09-30 | 2015-06-03 | 深圳超多维光电子有限公司 | Stereoscopic display method and stereoscopic display equipment |
JP5715539B2 (en) * | 2011-10-06 | 2015-05-07 | 株式会社ジャパンディスプレイ | Display device and electronic device |
KR20130037861A (en) * | 2011-10-07 | 2013-04-17 | 삼성디스플레이 주식회사 | Display apparatus and method of displaying three dimensional image using the same |
KR101813035B1 (en) | 2011-10-10 | 2017-12-28 | 엘지전자 주식회사 | Mobile terminal and method for controlling the same |
WO2013055164A1 (en) * | 2011-10-13 | 2013-04-18 | 삼성전자 주식회사 | Method for displaying contents, method for synchronizing contents, and method and device for displaying broadcast contents |
GB2495725B (en) * | 2011-10-18 | 2014-10-01 | Sony Comp Entertainment Europe | Image transfer apparatus and method |
JP5149435B1 (en) * | 2011-11-04 | 2013-02-20 | 株式会社東芝 | Video processing apparatus and video processing method |
US8933935B2 (en) | 2011-11-10 | 2015-01-13 | 7D Surgical Inc. | Method of rendering and manipulating anatomical images on mobile computing device |
KR101887058B1 (en) * | 2011-11-11 | 2018-08-09 | 엘지전자 주식회사 | A process for processing a three-dimensional image and a method for controlling electric power of the same |
US20140327708A1 (en) * | 2011-11-15 | 2014-11-06 | Sharp Kabushiki Kaisha | Display device |
US20130127841A1 (en) * | 2011-11-18 | 2013-05-23 | Samsung Electronics Co., Ltd. | Three-dimensional (3d) image display method and apparatus for 3d imaging and displaying contents according to start or end of operation |
US9942580B2 (en) | 2011-11-18 | 2018-04-10 | AT&T Intellectual Property I, L.P. | System and method for automatically selecting encoding/decoding for streaming media
US8660362B2 (en) * | 2011-11-21 | 2014-02-25 | Microsoft Corporation | Combined depth filtering and super resolution |
WO2013081985A1 (en) | 2011-11-28 | 2013-06-06 | Magna Electronics, Inc. | Vision system for vehicle |
DE102011055967B4 (en) * | 2011-12-02 | 2016-03-10 | Seereal Technologies S.A. | Measuring method and device for carrying out the measuring method |
US9626798B2 (en) | 2011-12-05 | 2017-04-18 | AT&T Intellectual Property I, L.P. | System and method to digitally replace objects in images or video
CN103163650A (en) * | 2011-12-08 | 2013-06-19 | 武汉天马微电子有限公司 | Naked eye three-dimensional (3D) grating structure |
US20130156090A1 (en) * | 2011-12-14 | 2013-06-20 | Ati Technologies Ulc | Method and apparatus for enabling multiuser use |
US9042266B2 (en) * | 2011-12-21 | 2015-05-26 | Kik Interactive, Inc. | Methods and apparatus for initializing a network connection for an output device |
US20140317537A1 (en) * | 2011-12-22 | 2014-10-23 | Tencent Technology (Shenzhen) Company Limited | Browser based application program extension method and device |
EP2611176A3 (en) * | 2011-12-29 | 2015-11-18 | Samsung Electronics Co., Ltd. | Display apparatus and controlling method thereof |
CN202995143U (en) * | 2011-12-29 | 2013-06-12 | 三星电子株式会社 | Glasses device and display device |
US9392251B2 (en) | 2011-12-29 | 2016-07-12 | Samsung Electronics Co., Ltd. | Display apparatus, glasses apparatus and method for controlling depth |
TWI467235B (en) * | 2012-02-06 | 2015-01-01 | Innocom Tech Shenzhen Co Ltd | Three-dimensional (3d) display and displaying method thereof |
US9324190B2 (en) | 2012-02-24 | 2016-04-26 | Matterport, Inc. | Capturing and aligning three-dimensional scenes |
US11282287B2 (en) | 2012-02-24 | 2022-03-22 | Matterport, Inc. | Employing three-dimensional (3D) data predicted from two-dimensional (2D) images using neural networks for 3D modeling applications and other applications |
US10848731B2 (en) | 2012-02-24 | 2020-11-24 | Matterport, Inc. | Capturing and aligning panoramic image and depth data |
CN103294453B (en) * | 2012-02-24 | 2017-02-22 | 华为技术有限公司 | Image processing method and image processing device |
KR20130098023A (en) * | 2012-02-27 | 2013-09-04 | 한국전자통신연구원 | Apparatus and method for displaying an image on 3-dimensional display based on multi-layer parallax barrier
JP5942477B2 (en) | 2012-02-29 | 2016-06-29 | 富士ゼロックス株式会社 | Setting device and program |
EP2637416A1 (en) * | 2012-03-06 | 2013-09-11 | Alcatel Lucent | A system and method for optimized streaming of variable multi-viewpoint media |
JP6015743B2 (en) * | 2012-03-07 | 2016-10-26 | ソニー株式会社 | Information processing apparatus, information processing method, and program |
JP5762998B2 (en) * | 2012-03-07 | 2015-08-12 | 株式会社ジャパンディスプレイ | Display device and electronic device |
JP5806150B2 (en) * | 2012-03-13 | 2015-11-10 | 株式会社ジャパンディスプレイ | Display device |
JP5779124B2 (en) * | 2012-03-13 | 2015-09-16 | 株式会社ジャパンディスプレイ | Display device and electronic device |
US9280042B2 (en) * | 2012-03-16 | 2016-03-08 | City University Of Hong Kong | Automatic switching of a multi-mode projector display screen for displaying three-dimensional and two-dimensional images |
CN102650741B (en) * | 2012-03-16 | 2014-06-11 | 京东方科技集团股份有限公司 | Light splitting device, manufacturing method thereof and 3D (Three-Dimensional) display device |
WO2013135203A1 (en) | 2012-03-16 | 2013-09-19 | Tencent Technology (Shenzhen) Company Limited | Offline download method and system |
US9733707B2 (en) | 2012-03-22 | 2017-08-15 | Honeywell International Inc. | Touch screen display user interface and method for improving touch interface utility on the same employing a rules-based masking system |
US20130265297A1 (en) * | 2012-04-06 | 2013-10-10 | Motorola Mobility, Inc. | Display of a Corrected Browser Projection of a Visual Guide for Placing a Three Dimensional Object in a Browser |
US9308439B2 (en) * | 2012-04-10 | 2016-04-12 | Bally Gaming, Inc. | Controlling three-dimensional presentation of wagering game content |
WO2013153418A1 (en) * | 2012-04-12 | 2013-10-17 | Sony Mobile Communications Ab | Improved 3d image display system |
CN102645959A (en) * | 2012-04-16 | 2012-08-22 | 上海颖杰计算机系统设备有限公司 | 3D (Three Dimensional) integrated computer |
KR101923150B1 (en) * | 2012-04-16 | 2018-11-29 | 삼성디스플레이 주식회사 | Display apparatus and method of displaying three dimensional image using the same |
US20150062315A1 (en) * | 2012-04-18 | 2015-03-05 | The Regents Of The University Of California | Simultaneous 2d and 3d images on a display |
EP2653906B1 (en) | 2012-04-20 | 2022-08-24 | Dolby Laboratories Licensing Corporation | A system for delivering stereoscopic images |
CN103379362B (en) * | 2012-04-24 | 2017-07-07 | 腾讯科技(深圳)有限公司 | VOD method and system |
US9201495B2 (en) * | 2012-04-24 | 2015-12-01 | Mobitv, Inc. | Control of perspective in multi-dimensional media |
US9707892B2 (en) * | 2012-04-25 | 2017-07-18 | Gentex Corporation | Multi-focus optical system |
US20130290867A1 (en) * | 2012-04-27 | 2013-10-31 | Litera Technologies, LLC | Systems and Methods For Providing Dynamic and Interactive Viewing and Control of Applications |
KR20130123599A (en) * | 2012-05-03 | 2013-11-13 | 한국과학기술원 | Speed dependent automatic dimming technique |
CN103457960B (en) | 2012-05-15 | 2018-03-09 | 腾讯科技(深圳)有限公司 | Method and system for loading files in a web game
US10089537B2 (en) | 2012-05-18 | 2018-10-02 | Magna Electronics Inc. | Vehicle vision system with front and rear camera integration |
US9201270B2 (en) * | 2012-06-01 | 2015-12-01 | Leia Inc. | Directional backlight with a modulation layer |
JP2015525370A (en) * | 2012-06-01 | 2015-09-03 | コーニンクレッカ フィリップス エヌ ヴェ | Autostereoscopic display device and driving method |
US8570651B1 (en) * | 2012-06-04 | 2013-10-29 | Hae-Yong Choi | Both side screen for combined use of 2D/3D images |
US9418672B2 (en) | 2012-06-05 | 2016-08-16 | Apple Inc. | Navigation application with adaptive instruction text |
US9997069B2 (en) | 2012-06-05 | 2018-06-12 | Apple Inc. | Context-aware voice guidance |
US9886794B2 (en) | 2012-06-05 | 2018-02-06 | Apple Inc. | Problem reporting in maps |
US9159153B2 (en) | 2012-06-05 | 2015-10-13 | Apple Inc. | Method, system and apparatus for providing visual feedback of a map view change |
US9230556B2 (en) | 2012-06-05 | 2016-01-05 | Apple Inc. | Voice instructions during navigation |
US8965696B2 (en) | 2012-06-05 | 2015-02-24 | Apple Inc. | Providing navigation instructions while operating navigation application in background |
US9482296B2 (en) | 2012-06-05 | 2016-11-01 | Apple Inc. | Rendering road signs during navigation |
US9111380B2 (en) | 2012-06-05 | 2015-08-18 | Apple Inc. | Rendering maps |
US10176633B2 (en) | 2012-06-05 | 2019-01-08 | Apple Inc. | Integrated mapping and navigation application |
US9367959B2 (en) * | 2012-06-05 | 2016-06-14 | Apple Inc. | Mapping application with 3D presentation |
JP6046923B2 (en) * | 2012-06-07 | 2016-12-21 | キヤノン株式会社 | Image coding apparatus, image coding method, and program |
WO2013183801A1 (en) * | 2012-06-08 | 2013-12-12 | Lg Electronics Inc. | Rendering method of 3d web-page and terminal using the same |
US9800862B2 (en) * | 2012-06-12 | 2017-10-24 | The Board Of Trustees Of The University Of Illinois | System and methods for visualizing information |
US9829996B2 (en) * | 2012-06-25 | 2017-11-28 | Zspace, Inc. | Operations in a three dimensional display system |
WO2014000129A1 (en) * | 2012-06-30 | 2014-01-03 | Intel Corporation | 3d graphical user interface |
KR101649660B1 (en) * | 2012-07-06 | 2016-08-19 | 엘지전자 주식회사 | Terminal for increasing visual comfort sensation of 3d object and control method thereof |
US20140022241A1 (en) * | 2012-07-18 | 2014-01-23 | Electronics And Telecommunications Research Institute | Display apparatus and method based on symmetrically spb |
US10353718B2 (en) * | 2012-07-23 | 2019-07-16 | Vmware, Inc. | Providing access to a remote application via a web client |
US8959176B2 (en) | 2012-07-31 | 2015-02-17 | Apple Inc. | Streaming common media content to multiple devices |
US9491784B2 (en) * | 2012-07-31 | 2016-11-08 | Apple Inc. | Streaming common media content to multiple devices |
US9786281B1 (en) * | 2012-08-02 | 2017-10-10 | Amazon Technologies, Inc. | Household agent learning |
KR101310941B1 (en) * | 2012-08-03 | 2013-09-23 | 삼성전자주식회사 | Display apparatus for displaying a plurality of content views, shutter glasses device for synchronizing with one of the content views and methods thereof
US9423871B2 (en) * | 2012-08-07 | 2016-08-23 | Honeywell International Inc. | System and method for reducing the effects of inadvertent touch on a touch screen controller |
US9225972B2 (en) | 2012-08-10 | 2015-12-29 | Pixtronix, Inc. | Three dimensional (3D) image generation using electromechanical display elements |
US9198209B2 (en) | 2012-08-21 | 2015-11-24 | Cisco Technology, Inc. | Providing integrated end-to-end architecture that includes quality of service transport for tunneled traffic |
CN103631021B (en) * | 2012-08-27 | 2016-06-15 | 群康科技(深圳)有限公司 | 3D display device and image display method thereof
TWI509289B (en) * | 2012-08-27 | 2015-11-21 | Innocom Tech Shenzhen Co Ltd | Stereoscopic display apparatus and image display method thereof |
KR20140028780A (en) | 2012-08-30 | 2014-03-10 | 삼성디스플레이 주식회사 | Display apparatus and method of displaying three dimensional image using the same |
US9811878B1 (en) * | 2012-09-04 | 2017-11-07 | Amazon Technologies, Inc. | Dynamic processing of image borders |
US10171540B2 (en) * | 2012-09-07 | 2019-01-01 | High Sec Labs Ltd | Method and apparatus for streaming video security |
US20150138444A1 (en) * | 2012-09-14 | 2015-05-21 | Masayuki Hirabayashi | Video display apparatus and terminal device |
US9179232B2 (en) * | 2012-09-17 | 2015-11-03 | Nokia Technologies Oy | Method and apparatus for associating audio objects with content and geo-location |
JP5837009B2 (en) * | 2012-09-26 | 2015-12-24 | キヤノン株式会社 | Display device and control method thereof |
CN104104934B (en) * | 2012-10-04 | 2019-02-19 | 陈笛 | Glasses-free multi-viewer three-dimensional display assembly and method
JP5928286B2 (en) * | 2012-10-05 | 2016-06-01 | 富士ゼロックス株式会社 | Information processing apparatus and program |
WO2014163665A1 (en) * | 2012-10-10 | 2014-10-09 | Kassouf Sidney | System for distributing auto-stereoscopic images |
US20140104242A1 (en) * | 2012-10-12 | 2014-04-17 | Nvidia Corporation | System and method for concurrent display of a video signal on a plurality of display devices |
US9235103B2 (en) * | 2012-10-25 | 2016-01-12 | Au Optronics Corporation | 3D liquid crystal display comprising four electrodes alternately arrange between a first and second substrate |
CN102917265A (en) * | 2012-10-25 | 2013-02-06 | 深圳创维-Rgb电子有限公司 | Information browsing method and system based on network television |
US9161018B2 (en) * | 2012-10-26 | 2015-10-13 | Christopher L. UHL | Methods and systems for synthesizing stereoscopic images |
TWI452345B (en) * | 2012-10-26 | 2014-09-11 | Au Optronics Corp | Three-dimensional display device and displaying method thereof
JP2014092744A (en) * | 2012-11-06 | 2014-05-19 | Japan Display Inc | Stereoscopic display device |
CN104516168B (en) * | 2012-11-21 | 2018-05-08 | 京东方科技集团股份有限公司 | Convertible lens and preparation method thereof, 2D-3D display substrate and display device
US9674510B2 (en) * | 2012-11-21 | 2017-06-06 | Elwha Llc | Pulsed projection system for 3D video |
CN102981343B (en) * | 2012-11-21 | 2015-01-07 | 京东方科技集团股份有限公司 | Convertible lens and preparation method thereof, as well as two-dimensional and three-dimensional display surface substrate and display device |
US9547937B2 (en) * | 2012-11-30 | 2017-01-17 | Legend3D, Inc. | Three-dimensional annotation system and method |
WO2014085910A1 (en) | 2012-12-04 | 2014-06-12 | Interaxon Inc. | System and method for enhancing content using brain-state data |
US9265458B2 (en) | 2012-12-04 | 2016-02-23 | Sync-Think, Inc. | Application of smooth pursuit cognitive testing paradigms to clinical drug development |
US9128580B2 (en) | 2012-12-07 | 2015-09-08 | Honeywell International Inc. | System and method for interacting with a touch screen interface utilizing an intelligent stencil mask |
US20140165209A1 (en) * | 2012-12-11 | 2014-06-12 | Verizon Patent And Licensing Inc. | Digital content delivery platform for multiple retailers |
US9047054B1 (en) | 2012-12-20 | 2015-06-02 | Audible, Inc. | User location-based management of content presentation |
US9497448B2 (en) * | 2012-12-31 | 2016-11-15 | Lg Display Co., Ltd. | Image processing method of transparent display apparatus and apparatus thereof |
TWI531213B (en) * | 2013-01-18 | 2016-04-21 | 國立成功大學 | Image conversion method and module for naked-eye 3d display |
US10327708B2 (en) | 2013-01-24 | 2019-06-25 | Kineticor, Inc. | Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan |
US9717461B2 (en) | 2013-01-24 | 2017-08-01 | Kineticor, Inc. | Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan |
US9305365B2 (en) | 2013-01-24 | 2016-04-05 | Kineticor, Inc. | Systems, devices, and methods for tracking moving targets |
EP2950714A4 (en) | 2013-02-01 | 2017-08-16 | Kineticor, Inc. | Motion tracking system for real time adaptive motion compensation in biomedical imaging |
WO2014129134A1 (en) * | 2013-02-19 | 2014-08-28 | パナソニック株式会社 | Image display device |
TWI502247B (en) * | 2013-02-26 | 2015-10-01 | Chunghwa Picture Tubes Ltd | Autostereoscopic display device and display method thereof |
US8712217B1 (en) | 2013-03-01 | 2014-04-29 | Comcast Cable Communications, Llc | Methods and systems for time-shifting content |
US9380976B2 (en) | 2013-03-11 | 2016-07-05 | Sync-Think, Inc. | Optical neuroinformatics |
US20140267601A1 (en) * | 2013-03-14 | 2014-09-18 | Corel Corporation | System and method for efficient editing of 3d video |
US20140268324A1 (en) * | 2013-03-18 | 2014-09-18 | 3-D Virtual Lens Technologies, Llc | Method of displaying 3d images from 2d source images using a barrier grid |
CN103236074B (en) * | 2013-03-25 | 2015-12-23 | 深圳超多维光电子有限公司 | 2D/3D image processing method and device
US10110647B2 (en) * | 2013-03-28 | 2018-10-23 | Qualcomm Incorporated | Method and apparatus for altering bandwidth consumption |
KR101981530B1 (en) | 2013-03-29 | 2019-05-23 | 엘지디스플레이 주식회사 | Stereoscopic image display device and method for driving the same |
CN103235415B (en) * | 2013-04-01 | 2015-12-23 | 昆山龙腾光电有限公司 | Grating-based multi-view autostereoscopic display
KR101970577B1 (en) * | 2013-04-09 | 2019-04-19 | 엘지디스플레이 주식회사 | Stereoscopic display device and eye-tracking method thereof |
US20140316907A1 (en) * | 2013-04-17 | 2014-10-23 | Asaf NAIM | Multilayered user interface for internet browser |
US20140328505A1 (en) * | 2013-05-02 | 2014-11-06 | Microsoft Corporation | Sound field adaptation based upon user tracking |
CN103293689B (en) * | 2013-05-31 | 2015-05-13 | 京东方科技集团股份有限公司 | Method capable of switching between different display modes and display device |
KR20140142863A (en) * | 2013-06-05 | 2014-12-15 | 한국전자통신연구원 | Apparatus and method for providing graphic editors |
TWI510813B (en) * | 2013-06-18 | 2015-12-01 | Zhangjiagang Kangde Xin Optronics Material Co Ltd | A liquid crystal parallax barrier device that displays three-dimensional images in both directions |
CN104238185B (en) * | 2013-06-19 | 2017-04-12 | 扬升照明股份有限公司 | Light source module, display device and light source module drive method |
CN103309639A (en) * | 2013-06-21 | 2013-09-18 | 广东威创视讯科技股份有限公司 | Method and device for split-screen display of a three-dimensional scene
US10003789B2 (en) | 2013-06-24 | 2018-06-19 | The Regents Of The University Of California | Practical two-frame 3D+2D TV |
CN103365657B (en) * | 2013-06-28 | 2019-03-15 | 北京智谷睿拓技术服务有限公司 | Display control method and device, and display equipment including the device
TWI495904B (en) * | 2013-07-12 | 2015-08-11 | Vision Technology Co Ltd C | Field sequential color lcd and method for generating 3d images by matching a software optical grating |
US9418469B1 (en) | 2013-07-19 | 2016-08-16 | Outward, Inc. | Generating video content |
JP2015025968A (en) * | 2013-07-26 | 2015-02-05 | ソニー株式会社 | Presentation medium and display device |
US9678929B2 (en) * | 2013-08-01 | 2017-06-13 | Equldo Limited | Stereoscopic online web content creation and rendering |
TWI489148B (en) * | 2013-08-23 | 2015-06-21 | Au Optronics Corp | Stereoscopic display and the driving method |
TWI505243B (en) * | 2013-09-10 | 2015-10-21 | Zhangjiagang Kangde Xin Optronics Material Co Ltd | A device that can display 2D and 3D images at the same time |
KR101856568B1 (en) * | 2013-09-16 | 2018-06-19 | 삼성전자주식회사 | Multi view image display apparatus and controlling method thereof |
US10592064B2 (en) * | 2013-09-17 | 2020-03-17 | Amazon Technologies, Inc. | Approaches for three-dimensional object display used in content navigation |
US10067634B2 (en) | 2013-09-17 | 2018-09-04 | Amazon Technologies, Inc. | Approaches for three-dimensional object display |
US9392355B1 (en) * | 2013-09-19 | 2016-07-12 | Voyetra Turtle Beach, Inc. | Gaming headset with voice scrambling for private in-game conversations |
CN103508999B (en) * | 2013-10-12 | 2015-05-13 | 浙江海正药业股份有限公司 | Maxacalcitol synthesizing intermediate and preparation method and application thereof |
US11343487B2 (en) | 2013-10-31 | 2022-05-24 | David Woods | Trackable glasses system for perspective views of a display |
US10116914B2 (en) * | 2013-10-31 | 2018-10-30 | 3Di Llc | Stereoscopic display |
US10652525B2 (en) | 2013-10-31 | 2020-05-12 | 3Di Llc | Quad view display system |
US9986228B2 (en) | 2016-03-24 | 2018-05-29 | 3Di Llc | Trackable glasses system that provides multiple views of a shared display |
US9883173B2 (en) | 2013-12-25 | 2018-01-30 | 3Di Llc | Stereoscopic display |
JP6411862B2 (en) | 2013-11-15 | 2018-10-24 | パナソニック株式会社 | File generation method and file generation apparatus |
KR20150057064A (en) * | 2013-11-18 | 2015-05-28 | 엘지전자 주식회사 | Electronic device and control method thereof |
US20150138184A1 (en) * | 2013-11-20 | 2015-05-21 | Apple Inc. | Spatially interactive computing device |
CN103605211B (en) * | 2013-11-27 | 2016-04-20 | 南京大学 | Tablet non-auxiliary stereo display device and method |
TWI511112B (en) * | 2013-11-27 | 2015-12-01 | Acer Inc | Image display method and display system |
KR20150065056A (en) * | 2013-12-04 | 2015-06-12 | 삼성디스플레이 주식회사 | Image display apparatus |
US9988047B2 (en) | 2013-12-12 | 2018-06-05 | Magna Electronics Inc. | Vehicle control system with traffic driving control |
US20150189256A1 (en) * | 2013-12-16 | 2015-07-02 | Christian Stroetmann | Autostereoscopic multi-layer display and control approaches |
CN103676302B (en) * | 2013-12-31 | 2016-04-06 | 京东方科技集团股份有限公司 | Array substrate, display device and method for realizing 2D/3D display switching
US10303242B2 (en) | 2014-01-06 | 2019-05-28 | Avegant Corp. | Media chair apparatus, system, and method |
US10409079B2 (en) | 2014-01-06 | 2019-09-10 | Avegant Corp. | Apparatus, system, and method for displaying an image using a plate |
JP6467680B2 (en) * | 2014-01-10 | 2019-02-13 | パナソニックIpマネジメント株式会社 | File generation method and file generation apparatus |
US9785623B2 (en) * | 2014-01-22 | 2017-10-10 | Freedom Scientific, Inc. | Identifying a set of related visible content elements in a markup language document |
WO2015112064A1 (en) * | 2014-01-23 | 2015-07-30 | Telefonaktiebolaget L M Ericsson (Publ) | Multi-view display control for channel selection |
US9182605B2 (en) * | 2014-01-29 | 2015-11-10 | Emine Goulanian | Front-projection autostereoscopic 3D display system |
US10554962B2 (en) | 2014-02-07 | 2020-02-04 | Samsung Electronics Co., Ltd. | Multi-layer high transparency display for light field generation |
US10565925B2 (en) | 2014-02-07 | 2020-02-18 | Samsung Electronics Co., Ltd. | Full color display with intrinsic transparency |
US10453371B2 (en) | 2014-02-07 | 2019-10-22 | Samsung Electronics Co., Ltd. | Multi-layer display with color and contrast enhancement |
US10375365B2 (en) | 2014-02-07 | 2019-08-06 | Samsung Electronics Co., Ltd. | Projection system with enhanced color and contrast |
CN103792672B (en) * | 2014-02-14 | 2016-03-23 | 成都京东方光电科技有限公司 | Stereo display assembly, liquid crystal panel and display device |
CN104853008B (en) * | 2014-02-17 | 2020-05-19 | 北京三星通信技术研究有限公司 | Portable device and method capable of switching between two-dimensional display and three-dimensional display |
KR101678389B1 (en) * | 2014-02-28 | 2016-11-22 | 엔트릭스 주식회사 | Method for providing media data based on cloud computing, apparatus and system |
US20150253974A1 (en) | 2014-03-07 | 2015-09-10 | Sony Corporation | Control of large screen display using wireless portable computer interfacing with display controller |
CN103903548B (en) * | 2014-03-07 | 2016-03-02 | 京东方科技集团股份有限公司 | Display panel driving method and driving system
CN106572810A (en) | 2014-03-24 | 2017-04-19 | 凯内蒂科尔股份有限公司 | Systems, methods, and devices for removing prospective motion correction from medical imaging scans |
US9373306B2 (en) * | 2014-03-25 | 2016-06-21 | Intel Corporation | Direct viewer projection
KR102175813B1 (en) * | 2014-04-18 | 2020-11-09 | 삼성디스플레이 주식회사 | Three dimensional image display device and method of processing image |
US20150334367A1 (en) * | 2014-05-13 | 2015-11-19 | Nagravision S.A. | Techniques for displaying three dimensional objects |
KR102204830B1 (en) * | 2014-05-20 | 2021-01-19 | 한국전자통신연구원 | Method and apparatus for providing three-dimensional territorial broadcasting based on non real time service
US9838756B2 (en) * | 2014-05-20 | 2017-12-05 | Electronics And Telecommunications Research Institute | Method and apparatus for providing three-dimensional territorial broadcasting based on non real time service |
CN104023223B (en) * | 2014-05-29 | 2016-03-02 | 京东方科技集团股份有限公司 | Display control method, apparatus and system
CN104090365A (en) * | 2014-06-18 | 2014-10-08 | 京东方科技集团股份有限公司 | Shutter glasses, display device, display system and display method |
US10613585B2 (en) * | 2014-06-19 | 2020-04-07 | Samsung Electronics Co., Ltd. | Transparent display apparatus, group play system using transparent display apparatus and performance methods thereof |
GB2527548A (en) * | 2014-06-25 | 2015-12-30 | Sharp Kk | Variable barrier pitch correction |
KR102221676B1 (en) * | 2014-07-02 | 2021-03-02 | 삼성전자주식회사 | Method, user terminal and audio system for speaker location and level control using the magnetic field
CN104155769A (en) * | 2014-07-15 | 2014-11-19 | 深圳市亿思达显示科技有限公司 | 2D/3D co-fusion display device and advertising device
CN104090818A (en) * | 2014-07-16 | 2014-10-08 | 北京智谷睿拓技术服务有限公司 | Information processing method, device and system |
TWI556624B (en) * | 2014-07-18 | 2016-11-01 | 友達光電股份有限公司 | Image displaying method and image display device
CN104252058B (en) * | 2014-07-18 | 2017-06-20 | 京东方科技集团股份有限公司 | Grating control method and device, grating, display panel and 3D display devices |
CN106714681A (en) | 2014-07-23 | 2017-05-24 | 凯内蒂科尔股份有限公司 | Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan |
EP3175773A4 (en) * | 2014-07-30 | 2018-10-10 | Olympus Corporation | Image processing device |
KR102366677B1 (en) * | 2014-08-02 | 2022-02-23 | 삼성전자주식회사 | Apparatus and Method for User Interaction thereof |
WO2016021861A1 (en) * | 2014-08-02 | 2016-02-11 | Samsung Electronics Co., Ltd. | Electronic device and user interaction method thereof |
CN105323654B (en) * | 2014-08-05 | 2019-02-15 | 优视科技有限公司 | Method and apparatus for presenting content data from a network
JP6327062B2 (en) * | 2014-08-25 | 2018-05-23 | オムロン株式会社 | Display device |
US9925980B2 (en) | 2014-09-17 | 2018-03-27 | Magna Electronics Inc. | Vehicle collision avoidance system with enhanced pedestrian avoidance |
US11205305B2 (en) | 2014-09-22 | 2021-12-21 | Samsung Electronics Company, Ltd. | Presentation of three-dimensional video |
US10313656B2 (en) | 2014-09-22 | 2019-06-04 | Samsung Electronics Company Ltd. | Image stitching for three-dimensional video |
WO2016046068A1 (en) | 2014-09-25 | 2016-03-31 | Koninklijke Philips N.V. | Display device with directional control of the output, and a backlight for such a display device |
FR3026589A1 (en) * | 2014-09-30 | 2016-04-01 | Orange | Method and device for adapting the display of a video stream by a client
FR3026852B1 (en) * | 2014-10-03 | 2016-12-02 | Thales Sa | Semi-transparent screen display system shared by two observers
US10506295B2 (en) * | 2014-10-09 | 2019-12-10 | Disney Enterprises, Inc. | Systems and methods for delivering secondary content to viewers |
KR102266064B1 (en) * | 2014-10-15 | 2021-06-18 | 삼성디스플레이 주식회사 | Method of driving display panel, display panel driving apparatus and display apparatus having the display panel driving apparatus |
US20160119685A1 (en) * | 2014-10-21 | 2016-04-28 | Samsung Electronics Co., Ltd. | Display method and display device |
CN104361622B (en) * | 2014-10-31 | 2018-06-19 | 福建星网视易信息系统有限公司 | Interface drawing method and device
DE102014225796A1 (en) * | 2014-12-15 | 2016-06-16 | Bayerische Motoren Werke Aktiengesellschaft | Method for controlling a vehicle system |
CN104461440B (en) * | 2014-12-31 | 2018-01-02 | 上海天马有机发光显示技术有限公司 | Rendering method, rendering device and display device
EP3243094B1 (en) | 2015-01-10 | 2022-03-23 | LEIA Inc. | Multibeam grating-based backlight and a method of electronic display operation |
KR102322340B1 (en) | 2015-01-10 | 2021-11-05 | 레이아 인코포레이티드 | Diffraction grating-based backlighting having controlled diffractive coupling efficiency |
JP6567058B2 (en) | 2015-01-10 | 2019-08-28 | Leia Inc. | 2D/3D switchable display backlight and electronic display
EP3248058B1 (en) | 2015-01-19 | 2020-05-06 | LEIA Inc. | Unidirectional grating-based backlighting employing a reflective island |
KR20160089600A (en) * | 2015-01-19 | 2016-07-28 | 삼성디스플레이 주식회사 | Display device |
US9690110B2 (en) * | 2015-01-21 | 2017-06-27 | Apple Inc. | Fine-coarse autostereoscopic display |
CN107209393B (en) * | 2015-01-28 | 2022-02-08 | 镭亚股份有限公司 | Three-dimensional (3D) electronic display |
US9973725B2 (en) * | 2015-02-02 | 2018-05-15 | Continental Teves Ag & Co. Ohg | Modular television system |
JP6359989B2 (en) * | 2015-02-24 | 2018-07-18 | 株式会社ジャパンディスプレイ | Display device and display method |
JP6359990B2 (en) * | 2015-02-24 | 2018-07-18 | 株式会社ジャパンディスプレイ | Display device and display method |
TWI554788B (en) * | 2015-03-04 | 2016-10-21 | 友達光電股份有限公司 | Display device |
KR102321364B1 (en) * | 2015-03-05 | 2021-11-03 | 삼성전자주식회사 | Method for synthesizing a 3d background content and device thereof
KR102329107B1 (en) | 2015-03-16 | 2021-11-18 | 레이아 인코포레이티드 | Unidirectional grating-based backlighting employing an angularly selective reflective layer |
JP6411257B2 (en) * | 2015-03-19 | 2018-10-24 | 株式会社ジャパンディスプレイ | Display device and control method thereof |
US9823474B2 (en) | 2015-04-02 | 2017-11-21 | Avegant Corp. | System, apparatus, and method for displaying an image with a wider field of view |
US9995857B2 (en) | 2015-04-03 | 2018-06-12 | Avegant Corp. | System, apparatus, and method for displaying an image using focal modulation |
US9846309B2 (en) * | 2015-04-17 | 2017-12-19 | Dongseo University Technology Headquarters | Depth-priority integral imaging display method using nonuniform dynamic mask array |
CN107533255A (en) | 2015-04-23 | 2018-01-02 | 镭亚股份有限公司 | Backlight based on double light guide gratings and electronic display using the backlight
US9705936B2 (en) * | 2015-04-24 | 2017-07-11 | Mersive Technologies, Inc. | System and method for interactive and real-time visualization of distributed media |
US10360617B2 (en) | 2015-04-24 | 2019-07-23 | Walmart Apollo, Llc | Automated shopping apparatus and method in response to consumption |
EP3295242B1 (en) | 2015-05-09 | 2020-05-06 | LEIA Inc. | Colour-scanning grating-based backlight and electronic display using the same |
CN104834104B (en) * | 2015-05-25 | 2017-05-24 | 京东方科技集团股份有限公司 | 2D/3D switchable display panel, and display method and display device thereof |
KR102329110B1 (en) | 2015-05-30 | 2021-11-18 | 레이아 인코포레이티드 | Vehicle monitoring system |
US10904091B2 (en) | 2015-06-03 | 2021-01-26 | Avago Technologies International Sales Pte. Limited | System for network-based reallocation of functions |
CN104883559A (en) * | 2015-06-06 | 2015-09-02 | 深圳市虚拟现实科技有限公司 | Video playing method and video playing device |
CN104851394B (en) * | 2015-06-10 | 2017-11-28 | 京东方科技集团股份有限公司 | Display device and display method
CN104849870B (en) * | 2015-06-12 | 2018-01-09 | 京东方科技集团股份有限公司 | Display panel and display device |
US10362342B2 (en) * | 2015-06-16 | 2019-07-23 | Lg Electronics Inc. | Broadcast signal transmission device, broadcast signal reception device, broadcast signal transmission method, and broadcast signal reception method |
US9846310B2 (en) * | 2015-06-22 | 2017-12-19 | Innolux Corporation | 3D image display device with improved depth ranges |
GB2540376A (en) * | 2015-07-14 | 2017-01-18 | Sharp Kk | Parallax barrier with independently controllable regions |
GB2540377A (en) | 2015-07-14 | 2017-01-18 | Sharp Kk | Parallax barrier with independently controllable regions |
FR3038995B1 (en) * | 2015-07-15 | 2018-05-11 | F4 | Interactive device with customizable display
WO2017015056A1 (en) * | 2015-07-17 | 2017-01-26 | Abl Ip Holding Llc | Arrangements for software configurable lighting device |
US10497337B2 (en) | 2015-07-17 | 2019-12-03 | Abl Ip Holding Llc | Systems and methods to provide configuration data to a software configurable lighting device |
KR20180030878A (en) | 2015-07-17 | 2018-03-26 | 에이비엘 아이피 홀딩, 엘엘씨 | Software configurable lighting devices |
US9943247B2 (en) | 2015-07-28 | 2018-04-17 | The University Of Hawai'i | Systems, devices, and methods for detecting false movements for motion correction during a medical imaging scan |
US10079000B2 (en) | 2015-08-12 | 2018-09-18 | Microsoft Technology Licensing, Llc | Reducing display degradation |
CN105100783B (en) | 2015-08-19 | 2018-03-23 | 京东方科技集团股份有限公司 | 3D display device and 3D display method |
US10186188B2 (en) * | 2015-09-23 | 2019-01-22 | Motorola Solutions, Inc. | Multi-angle simultaneous view light-emitting diode display |
EP3148188A1 (en) * | 2015-09-24 | 2017-03-29 | Airbus Operations GmbH | Virtual windows for airborne vehicles
FR3042620B1 (en) | 2015-10-16 | 2017-12-08 | F4 | Interactive web device with customizable display
CN106254845B (en) * | 2015-10-20 | 2017-08-25 | 深圳超多维光电子有限公司 | Naked-eye stereoscopic display method, device and electronic equipment
CN105306866A (en) * | 2015-10-27 | 2016-02-03 | 青岛海信电器股份有限公司 | Frame rate conversion method and device |
US10462453B2 (en) * | 2015-11-10 | 2019-10-29 | Koninklijke Philips N.V. | Display device and display control method |
US11079931B2 (en) | 2015-11-13 | 2021-08-03 | Harman International Industries, Incorporated | User interface for in-vehicle system |
US20170148488A1 (en) * | 2015-11-20 | 2017-05-25 | Mediatek Inc. | Video data processing system and associated method for analyzing and summarizing recorded video data |
US10144419B2 (en) | 2015-11-23 | 2018-12-04 | Magna Electronics Inc. | Vehicle dynamic control system for emergency handling |
US10716515B2 (en) | 2015-11-23 | 2020-07-21 | Kineticor, Inc. | Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan |
US9711128B2 (en) | 2015-12-04 | 2017-07-18 | Opentv, Inc. | Combined audio for multiple content presentation |
MX2018008789A (en) | 2016-01-19 | 2019-03-28 | Walmart Apollo Llc | Consumable item ordering system. |
US10373544B1 (en) | 2016-01-29 | 2019-08-06 | Leia, Inc. | Transformation from tiled to composite images |
NZ744813A (en) | 2016-01-29 | 2019-10-25 | Magic Leap Inc | Display for three-dimensional image |
WO2017156622A1 (en) * | 2016-03-13 | 2017-09-21 | Rising Sun Productions Limited | Head-mounted audiovisual capture device |
US10063917B2 (en) | 2016-03-16 | 2018-08-28 | Sorenson Media Inc. | Fingerprint layouts for content fingerprinting |
US10691880B2 (en) * | 2016-03-29 | 2020-06-23 | Microsoft Technology Licensing, Llc | Ink in an electronic document |
US10200428B1 (en) * | 2016-03-30 | 2019-02-05 | Amazon Technologies, Inc. | Unicast routing of a media stream to subscribers |
US10185787B1 (en) * | 2016-04-06 | 2019-01-22 | Bentley Systems, Incorporated | Tool for accurate onsite model visualization that facilitates environment interaction |
US10256277B2 (en) * | 2016-04-11 | 2019-04-09 | Abl Ip Holding Llc | Luminaire utilizing a transparent organic light emitting device display |
US10663755B2 (en) * | 2016-04-28 | 2020-05-26 | Hewlett-Packard Development Company, L.P. | Digital displays devices |
US10353534B2 (en) | 2016-05-13 | 2019-07-16 | Sap Se | Overview page in multi application user interface |
US10579238B2 (en) | 2016-05-13 | 2020-03-03 | Sap Se | Flexible screen layout across multiple platforms |
TWI626475B (en) * | 2016-06-08 | 2018-06-11 | 國立交通大學 | Stereoscopic display screen and stereoscopic display system |
EP3472832A4 (en) | 2016-06-17 | 2020-03-11 | DTS, Inc. | Distance panning using near / far-field rendering |
CN105842865B (en) * | 2016-06-21 | 2018-01-30 | 成都工业学院 | Slim grating 3D display device based on a slit grating
CN106257321B (en) * | 2016-06-28 | 2021-11-30 | 京东方科技集团股份有限公司 | 3D head-up display system and method |
US20180035236A1 (en) * | 2016-07-28 | 2018-02-01 | Leonardo Basterra | Audio System with Binaural Elements and Method of Use with Perspective Switching |
US10235010B2 (en) | 2016-07-28 | 2019-03-19 | Canon Kabushiki Kaisha | Information processing apparatus configured to generate an audio signal corresponding to a virtual viewpoint image, information processing system, information processing method, and non-transitory computer-readable storage medium |
US10089063B2 (en) | 2016-08-10 | 2018-10-02 | Qualcomm Incorporated | Multimedia device for processing spatialized audio based on movement |
US10154253B2 (en) * | 2016-08-29 | 2018-12-11 | Disney Enterprises, Inc. | Multi-view displays using images encoded with orbital angular momentum (OAM) on a pixel or image basis |
WO2018044711A1 (en) * | 2016-08-31 | 2018-03-08 | Wal-Mart Stores, Inc. | Systems and methods of enabling retail shopping while disabling components based on location |
US10271043B2 (en) | 2016-11-18 | 2019-04-23 | Zspace, Inc. | 3D user interface—360-degree visualization of 2D webpage content |
US10127715B2 (en) * | 2016-11-18 | 2018-11-13 | Zspace, Inc. | 3D user interface—non-native stereoscopic image conversion |
US11003305B2 (en) * | 2016-11-18 | 2021-05-11 | Zspace, Inc. | 3D user interface |
US10621898B2 (en) * | 2016-11-23 | 2020-04-14 | Pure Depth Limited | Multi-layer display system for vehicle dash or the like |
GB2556910A (en) * | 2016-11-25 | 2018-06-13 | Nokia Technologies Oy | Virtual reality display |
US10170060B2 (en) * | 2016-12-27 | 2019-01-01 | Facebook Technologies, Llc | Interlaced liquid crystal display panel and backlight used in a head mounted display |
US11051061B2 (en) | 2016-12-31 | 2021-06-29 | Turner Broadcasting System, Inc. | Publishing a disparate live media output stream using pre-encoded media assets |
US11038932B2 (en) | 2016-12-31 | 2021-06-15 | Turner Broadcasting System, Inc. | System for establishing a shared media session for one or more client devices |
US11503352B2 (en) | 2016-12-31 | 2022-11-15 | Turner Broadcasting System, Inc. | Dynamic scheduling and channel creation based on external data |
US10856016B2 (en) | 2016-12-31 | 2020-12-01 | Turner Broadcasting System, Inc. | Publishing disparate live media output streams in mixed mode based on user selection |
US11134309B2 (en) | 2016-12-31 | 2021-09-28 | Turner Broadcasting System, Inc. | Creation of channels using pre-encoded media assets |
US10425700B2 (en) | 2016-12-31 | 2019-09-24 | Turner Broadcasting System, Inc. | Dynamic scheduling and channel creation based on real-time or near-real-time content context analysis |
US10965967B2 (en) | 2016-12-31 | 2021-03-30 | Turner Broadcasting System, Inc. | Publishing a disparate per-client live media output stream based on dynamic insertion of targeted non-programming content and customized programming content |
US10645462B2 (en) | 2016-12-31 | 2020-05-05 | Turner Broadcasting System, Inc. | Dynamic channel versioning in a broadcast air chain |
US10075753B2 (en) | 2016-12-31 | 2018-09-11 | Turner Broadcasting System, Inc. | Dynamic scheduling and channel creation based on user selection |
US10992973B2 (en) | 2016-12-31 | 2021-04-27 | Turner Broadcasting System, Inc. | Publishing a plurality of disparate live media output stream manifests using live input streams and pre-encoded media assets |
US11051074B2 (en) | 2016-12-31 | 2021-06-29 | Turner Broadcasting System, Inc. | Publishing disparate live media output streams using live input streams |
US11109086B2 (en) | 2016-12-31 | 2021-08-31 | Turner Broadcasting System, Inc. | Publishing disparate live media output streams in mixed mode |
US10694231B2 (en) | 2016-12-31 | 2020-06-23 | Turner Broadcasting System, Inc. | Dynamic channel versioning in a broadcast air chain based on user preferences |
CN108287679A (en) * | 2017-01-10 | 2018-07-17 | ZTE Corporation | Display characteristic parameter adjustment method and terminal |
CN106710531B (en) * | 2017-01-19 | 2019-11-05 | Shenzhen China Star Optoelectronics Technology Co., Ltd. | Backlight control circuit and electronic device |
US11044464B2 (en) * | 2017-02-09 | 2021-06-22 | Fyusion, Inc. | Dynamic content modification of image and video based multi-view interactive digital media representations |
US10650416B1 (en) * | 2017-02-17 | 2020-05-12 | Sprint Communications Company L.P. | Live production interface and response testing |
US10210833B2 (en) * | 2017-03-31 | 2019-02-19 | Panasonic Liquid Crystal Display Co., Ltd. | Display device |
US10078135B1 (en) * | 2017-04-25 | 2018-09-18 | Intel Corporation | Identifying a physical distance using audio channels |
JP7089583B2 (en) | 2017-05-14 | 2022-06-22 | Leia Inc. | Multi-view backlight, display, and method with active emitter |
FR3066672B1 (en) * | 2017-05-19 | 2020-05-22 | Sagemcom Broadband Sas | Method for communicating an immersive video |
US10939169B2 (en) | 2017-05-25 | 2021-03-02 | Turner Broadcasting System, Inc. | Concurrent presentation of non-programming media assets with programming media content at client device |
CN116666814A (en) | 2017-05-30 | 2023-08-29 | Magic Leap, Inc. | Power supply assembly with fan assembly for electronic device |
WO2018231258A1 (en) * | 2017-06-16 | 2018-12-20 | Microsoft Technology Licensing, Llc | Generating user interface containers |
CN107146573B (en) * | 2017-06-26 | 2020-05-01 | Shanghai Tianma AM-OLED Co., Ltd. | Display panel, display method thereof and display device |
EP3422151A1 (en) * | 2017-06-30 | 2019-01-02 | Nokia Technologies Oy | Methods, apparatus, systems, computer programs for enabling consumption of virtual content for mediated reality |
US20190026004A1 (en) * | 2017-07-18 | 2019-01-24 | Chicago Labs, LLC | Three Dimensional Icons for Computer Applications |
IL271963B (en) | 2017-07-28 | 2022-08-01 | Magic Leap Inc | Fan assembly for displaying an image |
CN107396087B (en) * | 2017-07-31 | 2019-03-12 | BOE Technology Group Co., Ltd. | Naked-eye three-dimensional display device and control method therefor |
US10692279B2 (en) * | 2017-07-31 | 2020-06-23 | Quantum Spatial, Inc. | Systems and methods for facilitating making partial selections of multidimensional information while maintaining a multidimensional structure |
US11049218B2 (en) | 2017-08-11 | 2021-06-29 | Samsung Electronics Company, Ltd. | Seamless image stitching |
US10515397B2 (en) * | 2017-09-08 | 2019-12-24 | Uptown Network LLC | System and method for facilitating virtual gift giving |
CN107707901B (en) * | 2017-09-30 | 2019-10-25 | Shenzhen SuperD Technology Co., Ltd. | Display method, device and equipment for a naked-eye 3D display screen |
CN108205411A (en) * | 2017-09-30 | 2018-06-26 | ZTE Corporation | Display switching method and device, and terminal |
US10777057B1 (en) * | 2017-11-30 | 2020-09-15 | Amazon Technologies, Inc. | Premises security system with audio simulating occupancy |
US10212532B1 (en) | 2017-12-13 | 2019-02-19 | At&T Intellectual Property I, L.P. | Immersive media with media device |
EP3503579B1 (en) * | 2017-12-20 | 2022-03-23 | Nokia Technologies Oy | Multi-camera device |
US11132842B2 (en) * | 2017-12-22 | 2021-09-28 | Unity IPR ApS | Method and system for synchronizing a plurality of augmented reality devices to a virtual reality device |
JP2019154008A (en) * | 2018-03-06 | 2019-09-12 | Sharp Corporation | Stereoscopic image display device, method for displaying liquid crystal display, and program for liquid crystal display |
CN108469682A (en) * | 2018-03-30 | 2018-08-31 | BOE Technology Group Co., Ltd. | Three-dimensional display apparatus and 3D display method thereof |
CN108490703B (en) * | 2018-04-03 | 2021-10-15 | BOE Technology Group Co., Ltd. | Display system and display control method thereof |
US11025892B1 (en) | 2018-04-04 | 2021-06-01 | James Andrew Aman | System and method for simultaneously providing public and private images |
US10523921B2 (en) * | 2018-04-06 | 2019-12-31 | Zspace, Inc. | Replacing 2D images with 3D images |
US10523922B2 (en) * | 2018-04-06 | 2019-12-31 | Zspace, Inc. | Identifying replacement 3D images for 2D images via ranking criteria |
WO2019199359A1 (en) | 2018-04-08 | 2019-10-17 | Dts, Inc. | Ambisonic depth extraction |
KR102406219B1 (en) * | 2018-04-11 | 2022-06-08 | AlcaCruz Inc. | Digital media system |
US10999573B2 (en) * | 2018-04-25 | 2021-05-04 | Raxium, Inc. | Partial light field display architecture |
WO2019207440A1 (en) | 2018-04-26 | 2019-10-31 | Semiconductor Energy Laboratory Co., Ltd. | Display device and electronic apparatus |
EP3579584A1 (en) | 2018-06-07 | 2019-12-11 | Nokia Technologies Oy | Controlling rendering of a spatial audio scene |
US10600246B2 (en) * | 2018-06-15 | 2020-03-24 | Microsoft Technology Licensing, Llc | Pinning virtual reality passthrough regions to real-world locations |
KR102506873B1 (en) * | 2018-07-18 | 2023-03-08 | Hyundai Motor Company | Vehicle cluster having a three-dimensional effect, system having the same, and method for providing a three-dimensional scene thereof |
EP3832638A4 (en) * | 2018-07-27 | 2022-04-27 | Kyocera Corporation | Display device and mobile body |
US11212506B2 (en) | 2018-07-31 | 2021-12-28 | Intel Corporation | Reduced rendering of six-degree of freedom video |
US10762394B2 (en) | 2018-07-31 | 2020-09-01 | Intel Corporation | System and method for 3D blob classification and transmission |
US10893299B2 (en) | 2018-07-31 | 2021-01-12 | Intel Corporation | Surface normal vector processing mechanism |
US11178373B2 (en) | 2018-07-31 | 2021-11-16 | Intel Corporation | Adaptive resolution of point cloud and viewpoint prediction for video streaming in computing environments |
US10887574B2 (en) | 2018-07-31 | 2021-01-05 | Intel Corporation | Selective packing of patches for immersive video |
US11057631B2 (en) | 2018-10-10 | 2021-07-06 | Intel Corporation | Point cloud coding standard conformance definition in computing environments |
US11727859B2 (en) | 2018-10-25 | 2023-08-15 | Boe Technology Group Co., Ltd. | Display panel and display device |
CN109192136B (en) * | 2018-10-25 | 2020-12-22 | BOE Technology Group Co., Ltd. | Display substrate, light field display device and driving method thereof |
KR102023905B1 (en) * | 2018-11-09 | 2019-11-04 | Korea Electronics Technology Institute | Electronic device and method for multi-channel reproduction of tiled image |
US10880534B2 (en) * | 2018-11-09 | 2020-12-29 | Korea Electronics Technology Institute | Electronic device and method for tiled video multi-channel playback |
US10699673B2 (en) * | 2018-11-19 | 2020-06-30 | Facebook Technologies, Llc | Apparatus, systems, and methods for local dimming in brightness-controlled environments |
CN109598254B (en) * | 2018-12-17 | 2019-11-26 | Hainan University | Group-oriented combined optimization method for spatial representation |
US10880606B2 (en) | 2018-12-21 | 2020-12-29 | Turner Broadcasting System, Inc. | Disparate live media output stream playout and broadcast distribution |
US11082734B2 (en) | 2018-12-21 | 2021-08-03 | Turner Broadcasting System, Inc. | Publishing a disparate live media output stream that complies with distribution format regulations |
US10873774B2 (en) | 2018-12-22 | 2020-12-22 | Turner Broadcasting System, Inc. | Publishing a disparate live media output stream manifest that includes one or more media segments corresponding to key events |
CN109725819B (en) * | 2018-12-25 | 2022-12-13 | Zhejiang Jiuxuan Intelligent Information Technology Co., Ltd. | Interface display method and device, dual-screen dual-system terminal and readable storage medium |
US10854171B2 (en) | 2018-12-31 | 2020-12-01 | Samsung Electronics Co., Ltd. | Multi-user personal display system and applications thereof |
EP3687166A1 (en) * | 2019-01-23 | 2020-07-29 | Ultra-D Coöperatief U.A. | Interoperable 3d image content handling |
CN109686303B (en) * | 2019-01-28 | 2021-09-17 | Xiamen Tianma Microelectronics Co., Ltd. | Organic light-emitting display panel, organic light-emitting display device and compensation method |
JP7317517B2 (en) * | 2019-02-12 | 2023-07-31 | Japan Display Inc. | Display device |
US10932080B2 (en) | 2019-02-14 | 2021-02-23 | Microsoft Technology Licensing, Llc | Multi-sensor object tracking for modifying audio |
CN110007475A (en) * | 2019-04-17 | 2019-07-12 | Wanwei Yunshi (Shanghai) Digital Technology Co., Ltd. | Method and apparatus for compensating eyesight using virtual depth |
US10504453B1 (en) | 2019-04-18 | 2019-12-10 | Apple Inc. | Displays with adjustable direct-lit backlight units |
US10571744B1 (en) | 2019-04-18 | 2020-02-25 | Apple Inc. | Displays with adjustable direct-lit backlight units and power consumption compensation |
US10964275B2 (en) | 2019-04-18 | 2021-03-30 | Apple Inc. | Displays with adjustable direct-lit backlight units and adaptive processing |
US20220068185A1 (en) * | 2019-04-29 | 2022-03-03 | Hewlett-Packard Development Company, L.P. | Wireless configuration of display attribute |
CN110262051B (en) * | 2019-07-26 | 2023-12-29 | Chengdu Technological University | Retroreflective stereoscopic display device based on directional light source |
EP3779612A1 (en) * | 2019-08-16 | 2021-02-17 | The Swatch Group Research and Development Ltd | Method for broadcasting a message to the wearer of a watch |
CN112394845B (en) * | 2019-08-19 | 2024-03-01 | Beijing Xiaomi Mobile Software Co., Ltd. | Distance sensor module, display device, electronic equipment and distance detection method |
US11335095B1 (en) * | 2019-08-27 | 2022-05-17 | Gopro, Inc. | Systems and methods for characterizing visual content |
EP4025953A4 (en) * | 2019-09-03 | 2023-10-04 | Light Field Lab, Inc. | Light field display system for gaming environments |
CN111415629B (en) * | 2020-04-28 | 2022-02-22 | TCL China Star Optoelectronics Technology Co., Ltd. | Display device driving method and display device |
US11750795B2 (en) | 2020-05-12 | 2023-09-05 | Apple Inc. | Displays with viewer tracking |
US11936844B1 (en) | 2020-08-11 | 2024-03-19 | Apple Inc. | Pre-processing in a display pipeline |
CN112505942B (en) * | 2021-02-03 | 2021-04-20 | Chengdu Technological University | Multi-resolution stereoscopic display device based on rear projection light source |
CN113992885B (en) * | 2021-09-22 | 2023-03-21 | Lenovo (Beijing) Co., Ltd. | Data synchronization method and device |
NL2030325B1 (en) * | 2021-12-28 | 2023-07-03 | Dimenco Holding B.V. | Scaling of three-dimensional content for an autostereoscopic display device |
KR20230112485A (en) * | 2022-01-20 | 2023-07-27 | LG Electronics Inc. | Display device and operating method thereof |
CN114936002A (en) * | 2022-06-10 | 2022-08-23 | Banma Network Technology Co., Ltd. | Interface display method and device and vehicle |
Family Cites Families (164)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS56109649A (en) | 1980-02-05 | 1981-08-31 | Matsushita Electric Ind Co Ltd | Ultrasonic diagnosing device |
JPH05122733A (en) * | 1991-10-28 | 1993-05-18 | Nippon Hoso Kyokai (NHK) | Three-dimensional picture display device |
US5493427A (en) * | 1993-05-25 | 1996-02-20 | Sharp Kabushiki Kaisha | Three-dimensional display unit with a variable lens |
JPH10232626A (en) * | 1997-02-20 | 1998-09-02 | Canon Inc | Stereoscopic image display device |
US6590605B1 (en) | 1998-10-14 | 2003-07-08 | Dimension Technologies, Inc. | Autostereoscopic display |
US6533420B1 (en) | 1999-01-22 | 2003-03-18 | Dimension Technologies, Inc. | Apparatus and method for generating and projecting autostereoscopic images |
US6591306B1 (en) * | 1999-04-01 | 2003-07-08 | Nec Corporation | IP network access for portable devices |
US8271336B2 (en) | 1999-11-22 | 2012-09-18 | Accenture Global Services Gmbh | Increased visibility during order management in a network-based supply chain environment |
US7389214B1 (en) | 2000-05-01 | 2008-06-17 | Accenture, Llp | Category analysis in a market management |
AU2001266862A1 (en) * | 2000-06-12 | 2001-12-24 | Vrex, Inc. | Electronic stereoscopic media delivery system |
US6856581B1 (en) | 2000-10-31 | 2005-02-15 | International Business Machines Corporation | Batteryless, oscillatorless, binary time cell usable as an horological device with associated programming methods and devices |
WO2002037471A2 (en) | 2000-11-03 | 2002-05-10 | Zoesis, Inc. | Interactive character system |
DE10103922A1 (en) | 2001-01-30 | 2002-08-01 | Physoptics Opto Electronic GmbH | Interactive data viewing and operating system |
US20020194604A1 (en) | 2001-06-19 | 2002-12-19 | Sanchez Elizabeth C. | Interactive television virtual shopping cart |
JP2003322824A (en) * | 2002-02-26 | 2003-11-14 | Namco Ltd | Stereoscopic video display device and electronic apparatus |
JP3738843B2 (en) | 2002-06-11 | 2006-01-25 | Sony Corporation | Image detection apparatus, image detection method, and image detection program |
JP2004072202A (en) | 2002-08-01 | 2004-03-04 | Ktfreetel Co Ltd | Separate billing method of communication utility charge and apparatus therefor |
US20080008202A1 (en) | 2002-10-31 | 2008-01-10 | Terrell William C | Router with routing processors and methods for virtualization |
US7769668B2 (en) * | 2002-12-09 | 2010-08-03 | Sam Balabon | System and method for facilitating trading of financial instruments |
US8799366B2 (en) | 2002-12-11 | 2014-08-05 | Broadcom Corporation | Migration of stored media through a media exchange network |
US8270810B2 (en) | 2002-12-11 | 2012-09-18 | Broadcom Corporation | Method and system for advertisement insertion and playback for STB with PVR functionality |
CA2457602A1 (en) | 2003-02-19 | 2004-08-19 | Impatica Inc. | Method of synchronizing streams of real time data |
US8438601B2 (en) | 2003-07-02 | 2013-05-07 | Rovi Solutions Corporation | Resource management for a networked personal video recording system |
US7557876B2 (en) * | 2003-07-25 | 2009-07-07 | Nitto Denko Corporation | Anisotropic fluorescent thin crystal film and backlight system and liquid crystal display incorporating the same |
GB0326005D0 (en) | 2003-11-07 | 2003-12-10 | Koninkl Philips Electronics Nv | Waveguide for autostereoscopic display |
US7488072B2 (en) | 2003-12-04 | 2009-02-10 | New York University | Eye tracked foveal display by controlled illumination |
US8154686B2 (en) | 2004-01-20 | 2012-04-10 | Sharp Kabushiki Kaisha | Directional backlight, a multiple view display and a multi-direction display |
US20060087556A1 (en) * | 2004-10-21 | 2006-04-27 | Kazunari Era | Stereoscopic image display device |
JP2008522226A (en) | 2004-11-30 | 2008-06-26 | Agoura Technologies, Inc. | Application and fabrication technology of large-scale wire grid polarizers |
KR100786862B1 (en) | 2004-11-30 | 2007-12-20 | Samsung SDI Co., Ltd. | Barrier device, three dimensional image display using the same and method thereof |
WO2006061801A1 (en) | 2004-12-10 | 2006-06-15 | Koninklijke Philips Electronics, N.V. | Wireless video streaming using single layer coding and prioritized streaming |
JP4600317B2 (en) | 2005-03-31 | 2010-12-15 | Casio Computer Co., Ltd. | Illumination device that emits at least two illumination lights having directivity and display device using the same |
KR100732961B1 (en) | 2005-04-01 | 2007-06-27 | Kyung Hee University Industry-Academic Cooperation Foundation | Multi-view scalable image encoding and decoding method, and apparatus therefor |
EP3522529B1 (en) * | 2005-04-29 | 2021-01-27 | Koninklijke Philips N.V. | A stereoscopic display apparatus |
KR100661241B1 (en) * | 2005-05-16 | 2006-12-22 | LG Electronics Inc. | Fabrication method of optical sheet |
GB2426351A (en) * | 2005-05-19 | 2006-11-22 | Sharp Kk | A dual view display |
KR100813961B1 (en) * | 2005-06-14 | 2008-03-14 | Samsung Electronics Co., Ltd. | Method and apparatus for transmitting and receiving video, and transport stream structure thereof |
JP5091857B2 (en) | 2005-06-30 | 2012-12-05 | Koninklijke Philips Electronics N.V. | System control method |
KR100647517B1 (en) | 2005-08-26 | 2006-11-23 | MasterImage Co., Ltd. | Cell type parallax-barrier and stereoscopic image display apparatus using the same |
JP5112326B2 (en) | 2005-11-02 | 2013-01-09 | Koninklijke Philips Electronics N.V. | Optical system for 3D display |
US20070110035A1 (en) | 2005-11-14 | 2007-05-17 | Broadcom Corporation, A California Corporation | Network nodes cooperatively routing traffic flow amongst wired and wireless networks |
JP5121136B2 (en) * | 2005-11-28 | 2013-01-16 | Japan Display West Co., Ltd. | Image display device, electronic device, portable device, and image display method |
KR100739067B1 (en) | 2005-11-30 | 2007-07-12 | Samsung SDI Co., Ltd. | Three-dimensional display device |
US8493440B2 (en) * | 2005-12-20 | 2013-07-23 | Koninklijke Philips N.V. | Autostereoscopic display device |
US20070153122A1 (en) | 2005-12-30 | 2007-07-05 | Ayite Nii A | Apparatus and method for simultaneous multiple video channel viewing |
US8233034B2 (en) | 2006-02-10 | 2012-07-31 | Reald Inc. | Multi-functional active matrix liquid crystal displays |
US20070225994A1 (en) | 2006-03-17 | 2007-09-27 | Moore Barrett H | Method for Providing Private Civil Security Services Bundled with Second Party Products |
US8310533B2 (en) | 2006-03-27 | 2012-11-13 | GE Sensing & Inspection Technologies, LP | Inspection apparatus for inspecting articles |
US8466954B2 (en) | 2006-04-03 | 2013-06-18 | Sony Computer Entertainment Inc. | Screen sharing method and apparatus |
KR100893616B1 (en) | 2006-04-17 | 2009-04-20 | Samsung Mobile Display Co., Ltd. | Electronic imaging device, 2D/3D image display device and the driving method thereof |
TWI378747B (en) | 2006-08-18 | 2012-12-01 | Ind Tech Res Inst | Flexible electronic assembly |
US20110090413A1 (en) | 2006-08-18 | 2011-04-21 | Industrial Technology Research Institute | 3-dimensional image display |
US7844547B2 (en) | 2006-08-21 | 2010-11-30 | Carl Raymond Amos | Uncle gem IV, universal automatic instant money, data and precious metal and stone transfer machine |
WO2008038068A1 (en) | 2006-09-25 | 2008-04-03 | Nokia Corporation | Supporting a 3d presentation |
JP4669482B2 (en) * | 2006-09-29 | 2011-04-13 | Seiko Epson Corporation | Display device, image processing method, and electronic apparatus |
US20080086685A1 (en) | 2006-10-05 | 2008-04-10 | James Janky | Method for delivering tailored asset information to a device |
US8645176B2 (en) | 2006-10-05 | 2014-02-04 | Trimble Navigation Limited | Utilizing historical data in an asset management environment |
US20080086391A1 (en) | 2006-10-05 | 2008-04-10 | Kurt Maynard | Impromptu asset tracking |
US7640223B2 (en) | 2006-11-16 | 2009-12-29 | University Of Tennessee Research Foundation | Method of organizing and presenting data in a table using stutter peak rule |
US7586681B2 (en) | 2006-11-29 | 2009-09-08 | Honeywell International Inc. | Directional display |
TW200834151A (en) | 2006-11-30 | 2008-08-16 | Westar Display Technologies Inc | Motion artifact measurement for display devices |
JP4285532B2 (en) | 2006-12-01 | 2009-06-24 | Sony Corporation | Backlight control device, backlight control method, and liquid crystal display device |
US8248462B2 (en) * | 2006-12-15 | 2012-08-21 | The Board Of Trustees Of The University Of Illinois | Dynamic parallax barrier autostereoscopic display system and method |
JP4686795B2 (en) * | 2006-12-27 | 2011-05-25 | 富士フイルム株式会社 | Image generating apparatus and image reproducing apparatus |
US7924456B1 (en) | 2007-01-12 | 2011-04-12 | Broadbus Technologies, Inc. | Data distribution and buffering |
CN101013559A (en) | 2007-01-30 | 2007-08-08 | BOE Technology Group Co., Ltd. | LED brightness control circuit and LCD backlight |
JP4255032B2 (en) | 2007-03-15 | 2009-04-15 | Fujitsu Ten Limited | Display device and display method |
US7917853B2 (en) | 2007-03-21 | 2011-03-29 | At&T Intellectual Property I, L.P. | System and method of presenting media content |
US8269822B2 (en) | 2007-04-03 | 2012-09-18 | Sony Computer Entertainment America, LLC | Display viewing system and methods for optimizing display view based on active tracking |
US8600932B2 (en) | 2007-05-07 | 2013-12-03 | Trimble Navigation Limited | Telematic asset microfluidic analysis |
GB0709134D0 (en) * | 2007-05-11 | 2007-06-20 | Surman Philip | Multi-user autostereoscopic Display |
GB0709411D0 (en) | 2007-05-16 | 2007-06-27 | Barco Nv | Methods and systems for stereoscopic imaging |
TWI466093B (en) | 2007-06-26 | 2014-12-21 | Apple Inc | Management techniques for video playback |
KR101400285B1 (en) * | 2007-08-03 | 2014-05-30 | Samsung Electronics Co., Ltd. | Front light unit and flat display apparatus employing the same |
US7911442B2 (en) | 2007-08-27 | 2011-03-22 | Au Optronics Corporation | Dynamic color gamut of LED backlight |
KR101362647B1 (en) | 2007-09-07 | 2014-02-12 | Samsung Electronics Co., Ltd. | System and method for generating and playing three dimensional image file including two dimensional image |
US7881976B2 (en) * | 2007-09-27 | 2011-02-01 | Virgin Mobile Usa, L.P. | Apparatus, methods and systems for discounted referral and recommendation of electronic content |
GB2453323A (en) | 2007-10-01 | 2009-04-08 | Sharp Kk | Flexible backlight arrangement and display |
TWI354115B (en) * | 2007-10-05 | 2011-12-11 | Ind Tech Res Inst | Three-dimensional display apparatus |
US8416247B2 (en) * | 2007-10-09 | 2013-04-09 | Sony Computer Entertainment America Inc. | Increasing the number of advertising impressions in an interactive environment |
US8031175B2 (en) | 2008-04-21 | 2011-10-04 | Panasonic Corporation | Touch sensitive remote control system that detects hand size characteristics of user and adapts mapping to screen display |
US8121191B1 (en) | 2007-11-13 | 2012-02-21 | Harmonic Inc. | AVC to SVC transcoder |
JP4956520B2 (en) | 2007-11-13 | 2012-06-20 | Mitsumi Electric Co., Ltd. | Backlight device and liquid crystal display device using the same |
KR101439845B1 (en) | 2007-11-16 | 2014-09-12 | Samsung Electronics Co., Ltd. | Digital image processing apparatus |
US20090138280A1 (en) | 2007-11-26 | 2009-05-28 | The General Electric Company | Multi-stepped default display protocols |
JP5236938B2 (en) | 2007-12-03 | 2013-07-17 | Panasonic Corporation | Digital broadcast receiving apparatus, semiconductor integrated circuit, and digital broadcast receiving method |
TWI365302B (en) * | 2007-12-31 | 2012-06-01 | Ind Tech Res Inst | Stereo image display with switch function between horizontal display and vertical display |
US8339333B2 (en) | 2008-01-02 | 2012-12-25 | 3M Innovative Properties Company | Methods of reducing perceived image crosstalk in a multiview display |
WO2009098622A2 (en) | 2008-02-08 | 2009-08-13 | Koninklijke Philips Electronics N.V. | Autostereoscopic display device |
KR101451565B1 (en) | 2008-02-13 | 2014-10-16 | Samsung Electronics Co., Ltd. | Autostereoscopic display system |
JP5642347B2 (en) | 2008-03-07 | 2014-12-17 | Mitsumi Electric Co., Ltd. | LCD backlight device |
KR101488199B1 (en) * | 2008-03-12 | 2015-01-30 | Samsung Electronics Co., Ltd. | Method and apparatus for processing and reproducing image, and computer readable medium thereof |
US20090237492A1 (en) | 2008-03-18 | 2009-09-24 | Invism, Inc. | Enhanced stereoscopic immersive video recording and viewing |
US20090244266A1 (en) | 2008-03-26 | 2009-10-01 | Thomas Carl Brigham | Enhanced Three Dimensional Television |
JP4925354B2 (en) | 2008-03-31 | 2012-04-25 | Fujifilm Corporation | Image processing apparatus, image display apparatus, imaging apparatus, and image processing method |
GB0806183D0 (en) | 2008-04-04 | 2008-05-14 | Picsel Res Ltd | Presentation of objects in 3D displays |
US20090282429A1 (en) * | 2008-05-07 | 2009-11-12 | Sony Ericsson Mobile Communications Ab | Viewer tracking for displaying three dimensional views |
DE102008001644B4 (en) | 2008-05-08 | 2010-03-04 | Seereal Technologies S.A. | Device for displaying three-dimensional images |
US20090295791A1 (en) | 2008-05-29 | 2009-12-03 | Microsoft Corporation | Three-dimensional environment created from video |
CN101291415B (en) | 2008-05-30 | 2010-07-21 | Huawei Device Co., Ltd. | Method, apparatus and system for three-dimensional video communication |
US20090319625A1 (en) | 2008-06-20 | 2009-12-24 | Alcatel Lucent | Interactivity in a digital public signage network architecture |
TWI401658B (en) | 2008-07-18 | 2013-07-11 | Hannstar Display Corp | Gate line driving circuit of lcd panel |
US20100070987A1 (en) | 2008-09-12 | 2010-03-18 | At&T Intellectual Property I, L.P. | Mining viewer responses to multimedia content |
JP2010074557A (en) | 2008-09-18 | 2010-04-02 | Toshiba Corp | Television receiver |
CN101861735B (en) | 2008-09-18 | 2013-08-21 | Panasonic Corporation | Image decoding device, image encoding device, image decoding method, image encoding method |
KR20100033067A (en) | 2008-09-19 | 2010-03-29 | Samsung Electronics Co., Ltd. | Image display apparatus and method for both 2D and 3D images |
KR101497511B1 (en) | 2008-09-19 | 2015-03-02 | Samsung Electronics Co., Ltd. | Apparatus for multiplexing 2-dimensional and 3-dimensional images and video |
EP2395770A3 (en) | 2008-09-30 | 2013-09-25 | Panasonic Corporation | Recording medium, playback device, integrated circuit, playback method |
US20100107184A1 (en) | 2008-10-23 | 2010-04-29 | Peter Rae Shintani | TV with eye detection |
US8752087B2 (en) | 2008-11-07 | 2014-06-10 | At&T Intellectual Property I, L.P. | System and method for dynamically constructing personalized contextual video programs |
KR20110097879A (en) | 2008-11-24 | 2011-08-31 | Koninklijke Philips Electronics N.V. | Combining 3D video and auxiliary data |
US8103608B2 (en) | 2008-11-26 | 2012-01-24 | Microsoft Corporation | Reference model for data-driven analytics |
US20100128112A1 (en) | 2008-11-26 | 2010-05-27 | Samsung Electronics Co., Ltd | Immersive display system for interacting with three-dimensional content |
US20100135640A1 (en) | 2008-12-03 | 2010-06-03 | Dell Products L.P. | System and Method for Storing and Displaying 3-D Video Content |
US8209396B1 (en) | 2008-12-10 | 2012-06-26 | Howcast Media, Inc. | Video player |
CN102272778B (en) | 2009-01-07 | 2015-05-20 | Thomson Licensing | Joint depth estimation |
WO2010095381A1 (en) | 2009-02-20 | 2010-08-26 | Panasonic Corporation | Recording medium, reproduction device, and integrated circuit |
WO2010095440A1 (en) | 2009-02-20 | 2010-08-26 | Panasonic Corporation | Recording medium, reproduction device, and integrated circuit |
US9565397B2 (en) | 2009-02-26 | 2017-02-07 | Akamai Technologies, Inc. | Deterministically skewing transmission of content streams |
US20100225576A1 (en) | 2009-03-03 | 2010-09-09 | Horizon Semiconductors Ltd. | Three-dimensional interactive system and method |
US8477175B2 (en) | 2009-03-09 | 2013-07-02 | Cisco Technology, Inc. | System and method for providing three dimensional imaging in a network environment |
US20100231511A1 (en) | 2009-03-10 | 2010-09-16 | David L. Henty | Interactive media system with multi-directional remote control and dual mode camera |
EP2409495A4 (en) | 2009-03-16 | 2013-02-06 | LG Electronics Inc | A method of displaying three-dimensional image data and an apparatus of processing three-dimensional image data |
KR101427211B1 (en) * | 2009-03-27 | 2014-08-13 | Electronics and Telecommunications Research Institute | Method and apparatus for generating and consuming multi-view image media file |
JP5695819B2 (en) | 2009-03-30 | 2015-04-08 | Hitachi Maxell, Ltd. | TV operation method |
JP5542912B2 (en) | 2009-04-09 | 2014-07-09 | Telefonaktiebolaget LM Ericsson (publ) | Media container file management |
EP2425303B1 (en) | 2009-04-26 | 2019-01-16 | NIKE Innovate C.V. | Gps features and functionality in an athletic watch system |
US8315405B2 (en) | 2009-04-28 | 2012-11-20 | Bose Corporation | Coordinated ANR reference sound compression |
US8532310B2 (en) | 2010-03-30 | 2013-09-10 | Bose Corporation | Frequency-dependent ANR reference sound compression |
US20100280959A1 (en) | 2009-05-01 | 2010-11-04 | Darrel Stone | Real-time sourcing of service providers |
CN101983400B (en) | 2009-05-15 | 2013-07-17 | Toshiba Corporation | Image display device |
US8788676B2 (en) | 2009-05-22 | 2014-07-22 | Motorola Mobility Llc | Method and system for controlling data transmission to or from a mobile device |
US8704958B2 (en) | 2009-06-01 | 2014-04-22 | Lg Electronics Inc. | Image display device and operation method thereof |
US9237296B2 (en) | 2009-06-01 | 2016-01-12 | Lg Electronics Inc. | Image display apparatus and operating method thereof |
US20100309290A1 (en) | 2009-06-08 | 2010-12-09 | Stephen Brooks Myers | System for capture and display of stereoscopic content |
WO2010143820A2 (en) | 2009-06-08 | 2010-12-16 | LG Electronics Inc. | Device and method for providing a three-dimensional PIP image |
US8411746B2 (en) | 2009-06-12 | 2013-04-02 | Qualcomm Incorporated | Multiview video coding over MPEG-2 systems |
US20100321465A1 (en) | 2009-06-19 | 2010-12-23 | Dominique A Behrens Pa | Method, System and Computer Program Product for Mobile Telepresence Interactions |
CN102713738B (en) | 2009-08-07 | 2016-01-27 | RealD Inc. | Stereoscopic flat panel display with continuous illumination backlight |
US8976871B2 (en) | 2009-09-16 | 2015-03-10 | Qualcomm Incorporated | Media extractor tracks for file format track selection |
US8446462B2 (en) | 2009-10-15 | 2013-05-21 | At&T Intellectual Property I, L.P. | Method and system for time-multiplexed shared display |
US20110093882A1 (en) | 2009-10-21 | 2011-04-21 | Candelore Brant L | Parental control through the HDMI interface |
KR101600818B1 (en) * | 2009-11-06 | 2016-03-09 | Samsung Display Co., Ltd. | Three-dimensional optical module and display device including the same |
US8705624B2 (en) | 2009-11-24 | 2014-04-22 | STMicroelectronics International N. V. | Parallel decoding for scalable video coding |
US8335763B2 (en) | 2009-12-04 | 2012-12-18 | Microsoft Corporation | Concurrently presented data subfeeds |
US8462197B2 (en) | 2009-12-17 | 2013-06-11 | Motorola Mobility Llc | 3D video transforming device |
US20110153362A1 (en) | 2009-12-17 | 2011-06-23 | Valin David A | Method and mechanism for identifying protecting, requesting, assisting and managing information |
US8823782B2 (en) | 2009-12-31 | 2014-09-02 | Broadcom Corporation | Remote control with integrated position, viewer identification and optical and audio test |
US9247286B2 (en) | 2009-12-31 | 2016-01-26 | Broadcom Corporation | Frame formatting supporting mixed two and three dimensional video data communication |
US8854531B2 (en) | 2009-12-31 | 2014-10-07 | Broadcom Corporation | Multiple remote controllers that each simultaneously controls a different visual presentation of a 2D/3D display |
US20110157322A1 (en) | 2009-12-31 | 2011-06-30 | Broadcom Corporation | Controlling a pixel array to support an adaptable light manipulator |
US8384774B2 (en) | 2010-02-15 | 2013-02-26 | Eastman Kodak Company | Glasses for viewing stereo images |
US20110199469A1 (en) * | 2010-02-15 | 2011-08-18 | Gallagher Andrew C | Detection and display of stereo images |
KR101356248B1 (en) | 2010-02-19 | 2014-01-29 | LG Display Co., Ltd. | Image display device |
US9285589B2 (en) | 2010-02-28 | 2016-03-15 | Microsoft Technology Licensing, Llc | AR glasses with event and sensor triggered control of AR eyepiece applications |
US9129295B2 (en) | 2010-02-28 | 2015-09-08 | Microsoft Technology Licensing, Llc | See-through near-eye display glasses with a fast response photochromic film system for quick transition from dark to clear |
US8964298B2 (en) | 2010-02-28 | 2015-02-24 | Microsoft Corporation | Video display modification based on sensor input for a see-through near-to-eye display |
KR101324412B1 (en) | 2010-05-06 | 2013-11-01 | LG Display Co., Ltd. | Stereoscopic image display and driving method thereof |
JPWO2011142141A1 (en) | 2010-05-13 | 2013-07-22 | Panasonic Corporation | Display device and video viewing system |
KR101255711B1 (en) | 2010-07-02 | 2013-04-17 | LG Display Co., Ltd. | 3D image display device and driving method thereof |
US8605136B2 (en) | 2010-08-10 | 2013-12-10 | Sony Corporation | 2D to 3D user interface content data conversion |
US8363928B1 (en) | 2010-12-24 | 2013-01-29 | Trimble Navigation Ltd. | General orientation positioning system |
WO2012132797A1 (en) * | 2011-03-31 | 2012-10-04 | Fujifilm Corporation | Image capturing device and image capturing method |
WO2013078317A1 (en) * | 2011-11-21 | 2013-05-30 | Schlumberger Technology Corporation | Interface for controlling and improving drilling operations |
2010
- 2010-05-05 US US12/774,225 patent/US20110157322A1/en not_active Abandoned
- 2010-05-05 US US12/774,307 patent/US8964013B2/en active Active
- 2010-07-28 US US12/845,440 patent/US20110157697A1/en not_active Abandoned
- 2010-07-28 US US12/845,409 patent/US20110157696A1/en not_active Abandoned
- 2010-07-28 US US12/845,461 patent/US8767050B2/en active Active
- 2010-12-22 EP EP20100015980 patent/EP2357508A1/en not_active Withdrawn
- 2010-12-22 EP EP20100015984 patent/EP2357630A1/en not_active Ceased
- 2010-12-23 EP EP20100016055 patent/EP2357631A1/en not_active Ceased
- 2010-12-30 US US12/982,309 patent/US9204138B2/en active Active
- 2010-12-30 US US12/982,088 patent/US9066092B2/en active Active
- 2010-12-30 TW TW99146883A patent/TW201142356A/en unknown
- 2010-12-30 CN CN2010106160608A patent/CN102183840A/en active Pending
- 2010-12-30 US US12/982,053 patent/US20110157309A1/en not_active Abandoned
- 2010-12-30 US US12/982,248 patent/US20110157315A1/en not_active Abandoned
- 2010-12-30 US US12/982,140 patent/US20110161843A1/en not_active Abandoned
- 2010-12-30 US US12/982,199 patent/US8988506B2/en active Active
- 2010-12-30 TW TW99146892A patent/TWI467234B/en not_active IP Right Cessation
- 2010-12-30 US US12/982,124 patent/US9124885B2/en active Active
- 2010-12-30 US US12/982,047 patent/US20110157330A1/en not_active Abandoned
- 2010-12-30 US US12/982,377 patent/US20110157327A1/en not_active Abandoned
- 2010-12-30 US US12/982,330 patent/US20110157326A1/en not_active Abandoned
- 2010-12-30 US US12/982,020 patent/US20110157257A1/en not_active Abandoned
- 2010-12-30 US US12/982,069 patent/US8922545B2/en active Active
- 2010-12-30 US US12/982,156 patent/US9654767B2/en active Active
- 2010-12-30 US US12/982,212 patent/US9013546B2/en active Active
- 2010-12-30 US US12/982,273 patent/US9979954B2/en active Active
- 2010-12-30 US US12/982,173 patent/US9143770B2/en active Active
- 2010-12-30 US US12/982,062 patent/US8687042B2/en active Active
- 2010-12-30 US US12/982,031 patent/US9019263B2/en active Active
- 2010-12-30 US US12/982,362 patent/US9049440B2/en active Active
- 2010-12-30 EP EP20100016190 patent/EP2346021B1/en active Active
- 2010-12-31 CN CN201010616920.8A patent/CN102183841B/en active Active
- 2010-12-31 CN CN201010619646XA patent/CN102215408A/en active Pending
- 2010-12-31 TW TW99147124A patent/TW201142357A/en unknown
2012
- 2012-03-02 HK HK12102171A patent/HK1161754A1/en not_active IP Right Cessation
2014
- 2014-10-01 US US14/504,095 patent/US20150015668A1/en not_active Abandoned
2015
- 2015-02-06 US US14/616,130 patent/US20150156473A1/en not_active Abandoned
- 2015-05-28 US US14/723,922 patent/US20150264341A1/en not_active Abandoned
Patent Citations (100)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4829365A (en) * | 1986-03-07 | 1989-05-09 | Dimension Technologies, Inc. | Autostereoscopic display with illuminating lines, light valve and mask |
US20030154261A1 (en) * | 1994-10-17 | 2003-08-14 | The Regents Of The University Of California, A Corporation Of The State Of California | Distributed hypermedia method and system for automatically invoking external application providing interaction and display of embedded objects within a hypermedia document |
US20040036763A1 (en) * | 1994-11-14 | 2004-02-26 | Swift David C. | Intelligent method and system for producing and displaying stereoscopically-multiplexed images of three-dimensional objects for use in realistic stereoscopic viewing thereof in interactive virtual reality display environments |
US5615046A (en) * | 1995-01-23 | 1997-03-25 | Cyber Scientific Inc. | Stereoscopic viewing system |
US6094216A (en) * | 1995-05-22 | 2000-07-25 | Canon Kabushiki Kaisha | Stereoscopic image display method, and stereoscopic image display apparatus using the method |
US6909555B2 (en) * | 1995-06-07 | 2005-06-21 | Jacob N. Wohlstadter | Three dimensional imaging system |
US20040141237A1 (en) * | 1995-06-07 | 2004-07-22 | Wohlstadter Jacob N. | Three dimensional imaging system |
US5945965A (en) * | 1995-06-29 | 1999-08-31 | Canon Kabushiki Kaisha | Stereoscopic image display method |
US5959597A (en) * | 1995-09-28 | 1999-09-28 | Sony Corporation | Image/audio reproducing system |
US7123213B2 (en) * | 1995-10-05 | 2006-10-17 | Semiconductor Energy Laboratory Co., Ltd. | Three dimensional display unit and display method |
US6049424A (en) * | 1995-11-15 | 2000-04-11 | Sanyo Electric Co., Ltd. | Three dimensional display device |
US7190518B1 (en) * | 1996-01-22 | 2007-03-13 | 3Ality, Inc. | Systems for and methods of three dimensional viewing |
US7038698B1 (en) * | 1996-02-08 | 2006-05-02 | Palm Charles S | 3D stereo browser for the internet |
US6023277A (en) * | 1996-07-03 | 2000-02-08 | Canon Kabushiki Kaisha | Display control apparatus and method |
US5855425A (en) * | 1996-07-19 | 1999-01-05 | Sanyo Electric Co., Ltd. | Stereoscopic display |
US5969850A (en) * | 1996-09-27 | 1999-10-19 | Sharp Kabushiki Kaisha | Spatial light modulator, directional display and directional light source |
US5990975A (en) * | 1996-11-22 | 1999-11-23 | Acer Peripherals, Inc. | Dual screen displaying device |
US6285368B1 (en) * | 1997-02-10 | 2001-09-04 | Canon Kabushiki Kaisha | Image display system and image display apparatus and information processing apparatus in the system |
US7030903B2 (en) * | 1997-02-20 | 2006-04-18 | Canon Kabushiki Kaisha | Image display system, information processing apparatus, and method of controlling the same |
US6188442B1 (en) * | 1997-08-01 | 2001-02-13 | International Business Machines Corporation | Multiviewer display system for television monitors |
US6710920B1 (en) * | 1998-03-27 | 2004-03-23 | Sanyo Electric Co., Ltd | Stereoscopic display |
US6144375A (en) * | 1998-08-14 | 2000-11-07 | Praja Inc. | Multi-perspective viewer for content-based interactivity |
US6697687B1 (en) * | 1998-11-09 | 2004-02-24 | Hitachi, Ltd. | Image display apparatus having audio output control means in accordance with image signal type |
US20030012425A1 (en) * | 1998-11-12 | 2003-01-16 | Canon Kabushiki Kaisha | Viewpoint position detection apparatus and method, and stereoscopic image display system |
US20020171666A1 (en) * | 1999-02-19 | 2002-11-21 | Takaaki Endo | Image processing apparatus for interpolating and generating images from an arbitrary view point |
US20080184301A1 (en) * | 1999-10-29 | 2008-07-31 | Boylan Peter C | Interactive television system with programming-related links |
US20020010798A1 (en) * | 2000-04-20 | 2002-01-24 | Israel Ben-Shaul | Differentiated content and application delivery via internet |
US20030103165A1 (en) * | 2000-05-19 | 2003-06-05 | Werner Bullinger | System for operating a consumer electronics appliance |
US20020037037A1 (en) * | 2000-09-22 | 2002-03-28 | Philips Electronics North America Corporation | Preferred transmission/streaming order of fine-granular scalability |
US20020167862A1 (en) * | 2001-04-03 | 2002-11-14 | Carlo Tomasi | Method and apparatus for approximating a source position of a sound-causing event for determining an input used in operating an electronic device |
US20040252187A1 (en) * | 2001-09-10 | 2004-12-16 | Alden Ray M. | Processes and apparatuses for efficient multiple program and 3D display |
US20060256136A1 (en) * | 2001-10-01 | 2006-11-16 | Adobe Systems Incorporated, A Delaware Corporation | Compositing two-dimensional and three-dimensional image layers |
US20030137506A1 (en) * | 2001-11-30 | 2003-07-24 | Daniel Efran | Image-based rendering for 3D viewing |
US20030223499A1 (en) * | 2002-04-09 | 2003-12-04 | Nicholas Routhier | Process and system for encoding and playback of stereoscopic video sequences |
US20050248561A1 (en) * | 2002-04-25 | 2005-11-10 | Norio Ito | Multimedia information generation method and multimedia information reproduction device |
US20050259147A1 (en) * | 2002-07-16 | 2005-11-24 | Nam Jeho | Apparatus and method for adapting 2d and 3d stereoscopic video signal |
US20040027452A1 (en) * | 2002-08-07 | 2004-02-12 | Yun Kug Jin | Method and apparatus for multiplexing multi-view three-dimensional moving picture |
US20040041747A1 (en) * | 2002-08-27 | 2004-03-04 | Nec Corporation | 3D image/2D image switching display apparatus and portable terminal device |
US20040239231A1 (en) * | 2002-10-30 | 2004-12-02 | Keisuke Miyagawa | Display device and electronic equipment |
US20040109093A1 (en) * | 2002-12-05 | 2004-06-10 | Small-Stryker Aaron Tug | Method and apparatus for simultaneous television video presentation and separate viewing of different broadcasts |
US20080284844A1 (en) * | 2003-02-05 | 2008-11-20 | Graham John Woodgate | Switchable Lens |
US20040164292A1 (en) * | 2003-02-21 | 2004-08-26 | Yeh-Jiun Tung | Transflective display having an OLED backlight |
US20070097208A1 (en) * | 2003-05-28 | 2007-05-03 | Satoshi Takemoto | Stereoscopic image display apparatus, text data processing apparatus, program, and storing medium |
US20050073472A1 (en) * | 2003-07-26 | 2005-04-07 | Samsung Electronics Co., Ltd. | Method of removing Moire pattern in 3D image display apparatus using complete parallax |
US20070097103A1 (en) * | 2003-09-11 | 2007-05-03 | Shoji Yoshioka | Portable display device |
US20070085814A1 (en) * | 2003-09-20 | 2007-04-19 | Koninklijke Philips Electronics N.V. | Image display device |
US20080273242A1 (en) * | 2003-09-30 | 2008-11-06 | Graham John Woodgate | Directional Display Apparatus |
US20050128353A1 (en) * | 2003-12-16 | 2005-06-16 | Young Bruce A. | System and method for using second remote control device for sub-picture control in television receiver |
US7091471B2 (en) * | 2004-03-15 | 2006-08-15 | Agilent Technologies, Inc. | Using eye detection for providing control and power management of electronic devices |
US20050237487A1 (en) * | 2004-04-23 | 2005-10-27 | Chang Nelson L A | Color wheel assembly for stereoscopic imaging |
US7440193B2 (en) * | 2004-04-30 | 2008-10-21 | Gunasekaran R Alfred | Wide-angle variable focal length lens system |
US20060050785A1 (en) * | 2004-09-09 | 2006-03-09 | Nucore Technology Inc. | Inserting a high resolution still image into a lower resolution video stream |
US20090058845A1 (en) * | 2004-10-20 | 2009-03-05 | Yasuhiro Fukuda | Display device |
US20070296874A1 (en) * | 2004-10-20 | 2007-12-27 | Fujitsu Ten Limited | Display Device, Method of Adjusting the Image Quality of the Display Device, Device for Adjusting the Image Quality and Device for Adjusting the Contrast |
US20060109242A1 (en) * | 2004-11-19 | 2006-05-25 | Simpkins Daniel S | User interface for impaired users |
US20060139490A1 (en) * | 2004-12-15 | 2006-06-29 | Fekkes Wilhelmus F | Synchronizing audio with delayed video |
US20060139448A1 (en) * | 2004-12-29 | 2006-06-29 | Samsung Electronics Co., Ltd. | 3D displays with flexible switching capability of 2D/3D viewing modes |
US20090115800A1 (en) * | 2005-01-18 | 2009-05-07 | Koninklijke Philips Electronics, N.V. | Multi-view display device |
US20080192112A1 (en) * | 2005-03-18 | 2008-08-14 | Ntt Data Sanyo System Corporation | Stereoscopic Image Display Apparatus, Stereoscopic Image Displaying Method And Computer Program Product |
US20070139371A1 (en) * | 2005-04-04 | 2007-06-21 | Harsham Bret A | Control system and method for differentiating multiple users utilizing multi-view display devices |
US20080191964A1 (en) * | 2005-04-22 | 2008-08-14 | Koninklijke Philips Electronics, N.V. | Auto-Stereoscopic Display With Mixed Mode For Concurrent Display of Two- and Three-Dimensional Images |
US20080246757A1 (en) * | 2005-04-25 | 2008-10-09 | Masahiro Ito | 3D Image Generation and Display System |
US20090102915A1 (en) * | 2005-04-25 | 2009-04-23 | Svyatoslav Ivanovich Arsenich | Stereoprojection system |
US20060244918A1 (en) * | 2005-04-27 | 2006-11-02 | Actuality Systems, Inc. | Minimized-thickness angular scanner of electromagnetic radiation |
US20060256302A1 (en) * | 2005-05-13 | 2006-11-16 | Microsoft Corporation | Three-dimensional (3D) image projection |
US20090051759A1 (en) * | 2005-05-27 | 2009-02-26 | Adkins Sean M | Equipment and methods for the synchronization of stereoscopic projection displays |
US20060271791A1 (en) * | 2005-05-27 | 2006-11-30 | Sbc Knowledge Ventures, L.P. | Method and system for biometric based access control of media content presentation devices |
US20070096125A1 (en) * | 2005-06-24 | 2007-05-03 | Uwe Vogel | Illumination device |
US20070002041A1 (en) * | 2005-07-02 | 2007-01-04 | Samsung Electronics Co., Ltd. | Method and apparatus for encoding/decoding video data to implement local three-dimensional video |
US20070008406A1 (en) * | 2005-07-08 | 2007-01-11 | Samsung Electronics Co., Ltd. | High resolution 2D-3D switchable autostereoscopic display apparatus |
US20070008620A1 (en) * | 2005-07-11 | 2007-01-11 | Samsung Electronics Co., Ltd. | Switchable autostereoscopic display |
US20070052807A1 (en) * | 2005-09-07 | 2007-03-08 | Fuji Xerox Co., Ltd. | System and method for user monitoring interface of 3-D video streams from multiple cameras |
US20070072674A1 (en) * | 2005-09-12 | 2007-03-29 | Nintendo Co., Ltd. | Information processing program |
US7511774B2 (en) * | 2005-11-30 | 2009-03-31 | Samsung Mobile Display Co., Ltd. | Three-dimensional display device |
US20080259233A1 (en) * | 2005-12-20 | 2008-10-23 | Koninklijke Philips Electronics, N.V. | Autostereoscopic Display Device |
US20070146267A1 (en) * | 2005-12-22 | 2007-06-28 | Lg.Philips Lcd Co., Ltd. | Display device and method of driving the same |
US20070147827A1 (en) * | 2005-12-28 | 2007-06-28 | Arnold Sheynman | Methods and apparatus for wireless stereo video streaming |
US20070153916A1 (en) * | 2005-12-30 | 2007-07-05 | Sharp Laboratories Of America, Inc. | Wireless video transmission system |
US20070162392A1 (en) * | 2006-01-12 | 2007-07-12 | Microsoft Corporation | Management of Streaming Content |
US7359105B2 (en) * | 2006-02-07 | 2008-04-15 | Sharp Kabushiki Kaisha | Spatial light modulator and a display device |
US20090010264A1 (en) * | 2006-03-21 | 2009-01-08 | Huawei Technologies Co., Ltd. | Method and System for Ensuring QoS and SLA Server |
US20080133122A1 (en) * | 2006-03-29 | 2008-06-05 | Sanyo Electric Co., Ltd. | Multiple visual display device and vehicle-mounted navigation system |
US20080043096A1 (en) * | 2006-04-04 | 2008-02-21 | Anthony Vetro | Method and System for Decoding and Displaying 3D Light Fields |
US20070258140A1 (en) * | 2006-05-04 | 2007-11-08 | Samsung Electronics Co., Ltd. | Multiview autostereoscopic display |
US20070270218A1 (en) * | 2006-05-08 | 2007-11-22 | Nintendo Co., Ltd. | Storage medium having game program stored thereon and game apparatus |
US20080025390A1 (en) * | 2006-07-25 | 2008-01-31 | Fang Shi | Adaptive video frame interpolation |
US20080037120A1 (en) * | 2006-08-08 | 2008-02-14 | Samsung Electronics Co., Ltd. | High resolution 2d/3d switchable display apparatus |
US20080043644A1 (en) * | 2006-08-18 | 2008-02-21 | Microsoft Corporation | Techniques to perform rate matching for multimedia conference calls |
US20080126557A1 (en) * | 2006-09-08 | 2008-05-29 | Tetsuro Motoyama | System, method, and computer program product using an SNMP implementation to obtain vendor information from remote devices |
US20080068329A1 (en) * | 2006-09-15 | 2008-03-20 | Samsung Electronics Co., Ltd. | Multi-view autostereoscopic display with improved resolution |
US20080165176A1 (en) * | 2006-09-28 | 2008-07-10 | Charles Jens Archer | Method of Video Display and Multiplayer Gaming |
US20080150853A1 (en) * | 2006-12-22 | 2008-06-26 | Hong Kong Applied Science and Technology Research Institute Company Limited | Backlight device and liquid crystal display incorporating the backlight device |
US20080168129A1 (en) * | 2007-01-08 | 2008-07-10 | Jeffrey Robbin | Pairing a Media Server and a Media Client |
US20080303832A1 (en) * | 2007-06-11 | 2008-12-11 | Samsung Electronics Co., Ltd. | Method of generating two-dimensional/three-dimensional convertible stereoscopic image bitstream and method and apparatus for displaying the same |
US20090002178A1 (en) * | 2007-06-29 | 2009-01-01 | Microsoft Corporation | Dynamic mood sensing |
US20090052164A1 (en) * | 2007-08-24 | 2009-02-26 | Masako Kashiwagi | Directional backlight, display apparatus, and stereoscopic display apparatus |
US20090115783A1 (en) * | 2007-11-02 | 2009-05-07 | Dimension Technologies, Inc. | 3d optical illusions from off-axis displays |
US20090133051A1 (en) * | 2007-11-21 | 2009-05-21 | Gesturetek, Inc. | Device access control |
US20090138805A1 (en) * | 2007-11-21 | 2009-05-28 | Gesturetek, Inc. | Media preferences |
US20100045782A1 (en) * | 2008-08-25 | 2010-02-25 | Chihiro Morita | Content reproducing apparatus and method |
Cited By (39)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120194676A1 (en) * | 2009-10-07 | 2012-08-02 | Robert Laganiere | Video analytics method and system |
US9420250B2 (en) * | 2009-10-07 | 2016-08-16 | Robert Laganiere | Video analytics method and system |
US9049440B2 (en) | 2009-12-31 | 2015-06-02 | Broadcom Corporation | Independent viewer tailoring of same media source content via a common 2D-3D display |
US9019263B2 (en) | 2009-12-31 | 2015-04-28 | Broadcom Corporation | Coordinated driving of adaptable light manipulator, backlighting and pixel array in support of adaptable 2D and 3D displays |
US9654767B2 (en) | 2009-12-31 | 2017-05-16 | Avago Technologies General Ip (Singapore) Pte. Ltd. | Programming architecture supporting mixed two and three dimensional displays |
US9124885B2 (en) | 2009-12-31 | 2015-09-01 | Broadcom Corporation | Operating system supporting mixed 2D, stereoscopic 3D and multi-view 3D displays |
US8687042B2 (en) | 2009-12-31 | 2014-04-01 | Broadcom Corporation | Set-top box circuitry supporting 2D and 3D content reductions to accommodate viewing environment constraints |
US8767050B2 (en) | 2009-12-31 | 2014-07-01 | Broadcom Corporation | Display supporting multiple simultaneous 3D views |
US8823782B2 (en) | 2009-12-31 | 2014-09-02 | Broadcom Corporation | Remote control with integrated position, viewer identification and optical and audio test |
US8854531B2 (en) | 2009-12-31 | 2014-10-07 | Broadcom Corporation | Multiple remote controllers that each simultaneously controls a different visual presentation of a 2D/3D display |
US8922545B2 (en) | 2009-12-31 | 2014-12-30 | Broadcom Corporation | Three-dimensional display system with adaptation based on viewing reference of viewer(s) |
US8964013B2 (en) | 2009-12-31 | 2015-02-24 | Broadcom Corporation | Display with elastic light manipulator |
US8988506B2 (en) | 2009-12-31 | 2015-03-24 | Broadcom Corporation | Transcoder supporting selective delivery of 2D, stereoscopic 3D, and multi-view 3D content from source video |
US9143770B2 (en) | 2009-12-31 | 2015-09-22 | Broadcom Corporation | Application programming interface supporting mixed two and three dimensional displays |
US9013546B2 (en) | 2009-12-31 | 2015-04-21 | Broadcom Corporation | Adaptable media stream servicing two and three dimensional content |
US20110164115A1 (en) * | 2009-12-31 | 2011-07-07 | Broadcom Corporation | Transcoder supporting selective delivery of 2d, stereoscopic 3d, and multi-view 3d content from source video |
US9247286B2 (en) | 2009-12-31 | 2016-01-26 | Broadcom Corporation | Frame formatting supporting mixed two and three dimensional video data communication |
US9204138B2 (en) | 2009-12-31 | 2015-12-01 | Broadcom Corporation | User controlled regional display of mixed two and three dimensional content |
US20110164188A1 (en) * | 2009-12-31 | 2011-07-07 | Broadcom Corporation | Remote control with integrated position, viewer identification and optical and audio test |
US9066092B2 (en) | 2009-12-31 | 2015-06-23 | Broadcom Corporation | Communication infrastructure including simultaneous video pathways for multi-viewer support |
US9979954B2 (en) | 2009-12-31 | 2018-05-22 | Avago Technologies General Ip (Singapore) Pte. Ltd. | Eyewear with time shared viewing supporting delivery of differing content to multiple viewers |
US20150138323A1 (en) * | 2011-08-03 | 2015-05-21 | Sony Corporation | Image processing device and method, and program |
US20130033576A1 (en) * | 2011-08-03 | 2013-02-07 | Myokan Yoshihiro | Image processing device and method, and program |
US9019346B2 (en) * | 2011-08-03 | 2015-04-28 | Sony Corporation | Image processing device and method, and program |
US9497441B2 (en) * | 2011-08-03 | 2016-11-15 | Sony Corporation | Image processing device and method, and program |
US10659724B2 (en) * | 2011-08-24 | 2020-05-19 | Ati Technologies Ulc | Method and apparatus for providing dropped picture image processing |
US20130050572A1 (en) * | 2011-08-24 | 2013-02-28 | Ati Technologies Ulc | Method and apparatus for providing dropped picture image processing |
US9667919B2 (en) | 2012-08-02 | 2017-05-30 | Iwatchlife Inc. | Method and system for anonymous video analytics processing |
US9160963B2 (en) * | 2012-08-08 | 2015-10-13 | Samsung Electronics Co., Ltd | Terminal and method for generating live image |
US20140044412A1 (en) * | 2012-08-08 | 2014-02-13 | Samsung Electronics Co., Ltd. | Terminal and method for generating live image |
US9591295B2 (en) * | 2013-09-24 | 2017-03-07 | Amazon Technologies, Inc. | Approaches for simulating three-dimensional views |
US20150085076A1 (en) * | 2013-09-24 | 2015-03-26 | Amazon Technologies, Inc. | Approaches for simulating three-dimensional views
US20160255322A1 (en) * | 2013-10-07 | 2016-09-01 | Vid Scale, Inc. | User adaptive 3d video rendering and delivery |
US10802324B2 (en) | 2017-03-14 | 2020-10-13 | Boe Technology Group Co., Ltd. | Double vision display method and device |
US10375375B2 (en) * | 2017-05-15 | 2019-08-06 | Lg Electronics Inc. | Method of providing fixed region information or offset region information for subtitle in virtual reality system and device for controlling the same |
US10666922B2 (en) | 2017-05-15 | 2020-05-26 | Lg Electronics Inc. | Method of transmitting 360-degree video, method of receiving 360-degree video, device for transmitting 360-degree video, and device for receiving 360-degree video |
US10757392B2 (en) | 2017-05-15 | 2020-08-25 | Lg Electronics Inc. | Method of transmitting 360-degree video, method of receiving 360-degree video, device for transmitting 360-degree video, and device for receiving 360-degree video |
US11109013B2 (en) | 2017-05-15 | 2021-08-31 | Lg Electronics Inc. | Method of transmitting 360-degree video, method of receiving 360-degree video, device for transmitting 360-degree video, and device for receiving 360-degree video |
US10757324B2 (en) | 2018-08-03 | 2020-08-25 | Semiconductor Components Industries, Llc | Transform processors for gradually switching between image transforms |
Similar Documents
Publication | Title
---|---|
US20110157315A1 (en) | Interpolation of three-dimensional video content
US10051257B2 (en) | 3D image reproduction device and method capable of selecting 3D mode for 3D image
CN102428706B (en) | Entry points for 3D trickplay
WO2012017643A1 (en) | Encoding method, display device, and decoding method
CN102197655B (en) | Stereoscopic image reproduction method in case of pause mode and stereoscopic image reproduction apparatus using same
US20100103168A1 (en) | Methods and apparatuses for processing and displaying image
KR20110129903A (en) | Transferring of 3d viewer metadata
TW201010409A (en) | Versatile 3-D picture format
CN102484738A (en) | 3d screen size compensation
US10037335B1 (en) | Detection of 3-D videos
EP2586210A1 (en) | Method for decoding 2d-compatible stereoscopic video flows
CN103563363A (en) | Automatic conversion of a stereoscopic image in order to allow a simultaneous stereoscopic and monoscopic display of said image
WO2016010708A1 (en) | Adaptive stereo scaling format switch for 3d video encoding
JP5415217B2 (en) | 3D image processing device
EP2378778A1 (en) | 3d lcd using spectrum method and 3d image display apparatus using the same
CN102144395A (en) | Stereoscopic image reproduction method in quick search mode and stereoscopic image reproduction apparatus using same
Minoli | 3D television (3DTV) technology, systems, and deployment: Rolling out the infrastructure for next-generation entertainment
CN116320506A (en) | Stereoscopic interaction service management method for film and television videos
Grau et al. | 3D-TV R&D activities in Europe
US20110228058A1 (en) | Reproducing device, reproduction control method and program
KR101674688B1 (en) | A method for displaying a stereoscopic image and stereoscopic image playing device
EP2587812A2 (en) | Method for configuring stereoscopic moving picture file
JP7300294B2 (en) | 3D image transmission system, 3D image transmission device and 3D image display device
US20140078255A1 (en) | Reproduction device, reproduction method, and program
Gutiérrez Sánchez | Analysis of quality of experience in 3D video systems
Legal Events
Code | Title | Description
---|---|---|
AS | Assignment | Owner name: BANK OF AMERICA, N.A., AS COLLATERAL AGENT, NORTH CAROLINA. Free format text: PATENT SECURITY AGREEMENT;ASSIGNOR:BROADCOM CORPORATION;REEL/FRAME:037806/0001. Effective date: 20160201
STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION
AS | Assignment | Owner name: AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD., SINGAPORE. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:BROADCOM CORPORATION;REEL/FRAME:041706/0001. Effective date: 20170120
AS | Assignment | Owner name: BROADCOM CORPORATION, CALIFORNIA. Free format text: TERMINATION AND RELEASE OF SECURITY INTEREST IN PATENTS;ASSIGNOR:BANK OF AMERICA, N.A., AS COLLATERAL AGENT;REEL/FRAME:041712/0001. Effective date: 20170119