US20060152803A1 - Enhancement of depth perception - Google Patents

Enhancement of depth perception

Info

Publication number
US20060152803A1
Authority
US
United States
Prior art keywords
viewer
dimensional image
enhancement
depth perception
depth
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US11/033,186
Other versions
US7073908B1 (en)
Inventor
Anthony Provitola
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to US11/033,186 priority Critical patent/US7073908B1/en
Application filed by Individual filed Critical Individual
Priority to EA200701457A priority patent/EA013779B1/en
Priority to EP05856050A priority patent/EP1836531A4/en
Priority to MX2007008424A priority patent/MX2007008424A/en
Priority to CNA2005800464005A priority patent/CN101133360A/en
Priority to JP2007551283A priority patent/JP2008527918A/en
Priority to CA002593243A priority patent/CA2593243A1/en
Priority to AU2005324379A priority patent/AU2005324379A1/en
Priority to PCT/US2005/047577 priority patent/WO2006076173A2/en
Priority to BRPI0518498-3A priority patent/BRPI0518498A2/en
Application granted granted Critical
Publication of US7073908B1 publication Critical patent/US7073908B1/en
Publication of US20060152803A1 publication Critical patent/US20060152803A1/en
Priority to ZA200705671A priority patent/ZA200705671B/en
Priority to CL200702009A priority patent/CL2007002009A1/en
Legal status: Active

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0081 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for altering, e.g. enlarging, the entrance or exit pupil


Abstract

A system, device, and method for enhancing depth perception in a two-dimensional image is disclosed, providing for the induction of retinal disparity in the eyes of the viewer by the placement of a visually identifiable object in front of the two-dimensional image. Upon fusion of such retinal disparity in the viewer, the viewer experiences the enhancement of depth perception in the two-dimensional image.

Description

    CROSS-REFERENCE TO RELATED APPLICATION
  • This application is related to the application for the invention entitled “Enhancement of Visual Perception” by the same inventor filed on Sep. 10, 2005.
  • STATEMENT REGARDING FEDERALLY SPONSORED RESEARCH OR DEVELOPMENT
  • Not Applicable
  • REFERENCE TO MICROFICHE APPENDIX
  • Not Applicable
  • BACKGROUND OF THE INVENTION
  • The field of the invention pertains to devices, systems, and methods for enhancing the depth perception of a viewer in a two-dimensional image.
  • The present invention provides a device, method, and system for enhancing depth perception of the viewer in the viewing of two-dimensional images of all kinds, including photographs, posters, drawings and paintings, signs, etc. as well as television and motion pictures. In this broader respect, a distinction should be made between stereographic devices and methods which require two separate two-dimensional images of the same scene, sometimes combined, superimposed, or overlaid in a single two-dimensional image, and the extraction and reconstruction of binocular images of that scene for the viewer who has the capacity for stereopsis; and the present invention, which, by stimulating retinal disparity in the viewer, so enhances the perception of the monocular cues for depth in a single two-dimensional image as to convert the viewer's perception of such monocular depth cues to an experience of the fused binocular depth cue of stereo vision. The stereographic devices have been well known for many years, while the principle underlying the present invention, although not yet completely understood, is newly presented in this application. The subject of depth perception in viewing two-dimensional images as it applies to “depth perception of images on a television screen” has been discussed in LeMay, U.S. Pat. No. 5,488,510, but not with respect to depth perception in two-dimensional images generally, such as photographs, posters, paintings, signs, etc. The present invention is distinguishable from LeMay, which uses a window screen type mesh in a device to be worn by a viewer through which a two-dimensional television image is viewed, and creates, according to its inventor, an “illusion”. The present invention does not create an “illusion”, but provides the experience of the binocular fusion of retinally disparate images, and employs the viewer's capacity for stereopsis to enter the experience.
  • The present invention should also be distinguished from the well known effect that is observed with monocular viewing (with one eye) of a two-dimensional image with monocular depth cues against a flat background without such cues. The same effect can also be observed by monocular viewing of a two-dimensional image at the end of an enclosed space. With such a viewing the monocular depth cues in the two-dimensional image become significantly pronounced, albeit seen with only one eye. Such monocular viewing, however, deprives the viewer of the accommodation reflex which occurs with binocular vision that gives the viewer the ability to accurately focus on the two-dimensional image. The result is that although with such monocular viewing the monocular depth cues in the two-dimensional image have an effect greater than if viewed binocularly, the two-dimensional image cannot be seen with the same degree of focus as if seen binocularly. The present invention, on the other hand, induces a retinal disparity in the viewer that results in a fusion experience, and can be seen binocularly with the accurate focus of the accommodation reflex. The accurate focus in turn heightens the fusion experience, and thus the enhancement of depth perception afforded by the present invention.
  • The classification that applies to this invention is generally in U.S. Class 359, “OPTICAL: SYSTEMS AND ELEMENTS”, but the only subclass titles that provide a verbal similarity are 462, “STEREOSCOPIC”, and 478, “RELIEF ILLUSION”, the descriptions of neither being applicable to the theory of operability of the present invention.
  • SUMMARY OF THE INVENTION
  • The method underlying the system and device for enhancing depth perception in a substantially two-dimensional image has as the principal element the induction of retinal disparity in the eyes of the viewer from binocular retinal images of a scene including the two-dimensional image, by the addition to the scene of a visually identifiable object in front of the two-dimensional image, to create a “combined scene”. Upon fusion of the retinally disparate images in the visual cortex of the viewer of the monocular depth cues of the two-dimensional image as part of the combined scene, the monocular depth cues are interpreted as binocular depth cues with those of the visually identifiable object, so that the viewer experiences the enhancement of his or her depth perception in the two-dimensional image. The preferred visually identifiable object for the system and device is a frame, which surrounds the two-dimensional image while obscuring its edges, placed within the depth of field of focus and/or Panum's fusional region of the two-dimensional image. The elements of the system and device may have many other features, such as illumination, shape, color, etc. that can add to the enhancement effect by combination and/or control with reference to the qualities of the two-dimensional image and the viewer's vision.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a perspective view of an example of the physical configuration of the system and device.
  • DETAILED DESCRIPTION OF THE INVENTION
  • The invention is a method, system and device for enhancing the depth perception in a substantially two-dimensional image, hereinafter referred to simply as “2D image”, by a viewer thereof. The method underlying the system and device for enhancing depth perception in a substantially 2D image has as the principal element the induction of a slight but sufficient retinal disparity in the eyes of the viewer from binocular retinal images of a scene including the 2D image. Such a disparity in the binocular retinal images of the scene results from the addition to the scene of a visually identifiable object (VIO) in front of the 2D image, so as to create what will hereinafter be referred to as a “combined scene”. Upon fusion of the retinally disparate images in the viewer of the monocular depth cues of the 2D image as part of the combined scene, the monocular depth cues are interpreted as binocular depth cues with those of the VIO, so that the viewer experiences the enhancement of his or her depth perception in the 2D image, “the enhancement effect”. The 2D image is referred to as such regardless of the shape of the surface upon which the 2D image is represented in two dimensions. Therefore the surface upon which the 2D image is presented shall be referred to as the “image surface”, which may be flat or spherical or some other shape, and may be a surface in space which is not associated with a physical object. The object upon which the “image surface” may exist shall hereinafter be referred to as the “image object”, and may be solid, liquid or gaseous. Thus the 2D image may be displayed upon the image surface of an image object. The 2D image may be of any kind, including photographs, posters, drawings, paintings, signs, television and computer images, and all forms of front and rear projection images, film or electronic, both static and motion; and may exist on all kinds of objects that present a surface, such as buildings, stretched canvas, concrete slabs, pools of liquid, gas clouds, television and computer monitors (such as CRT, LCD, plasma, and TFT displays), projection screens, etc.
  • As a foundation for understanding the invention and the nature of the depth perception enhancement effect, the following currently accepted definitions and principles related to the human ability to appreciate depth may be considered:
  • 1. Binocular visual field: region of overlapping visibility for the two eyes.
  • 2. Point of Fixation (Point of Regard): Point or object on which the eyes are directed and one's sight is fixed.
  • 3. Fovea: Point on the retina on which are focused the rays coming from an object directly regarded.
  • 4. Stereopsis: Perception of depth produced by binocular retinal disparity within Panum's fusional region, requiring properly functioning cells in the visual cortex; the ability to distinguish the relative distance of objects with an apparent physical displacement between the objects resulting from the lateral displacement of the eyes that provides two slightly different views of the same object (disparate images), which allows stereoscopic depth discrimination.
  • 5. Monocular depth cues: Visual cues which present information about the relative location of objects using one eye, which include: occlusion or interposition; aerial perspective (atmospheric perspective, aerial haze); linear perspective (convergence of parallel lines); texture gradients; shading and light; relative size; relative motion (monocular movement parallax); and familiar size.
  • 6. Fusion: Neural process in the visual cortex that brings the retinal images in the two eyes to form a single cyclopean image.
  • 7. Panum's fusional region (Panum's fusional space): Region in visual space over which we perceive binocular single vision. (Outside Panum's fusional region physiological diplopia occurs.)
  • 8. Panum's fusional area: Area on the retina of one eye, any point on which, when stimulated simultaneously with a single specific slightly disparate point in the retina of the other eye, will give rise to a single fused image.
  • 9. Retinal disparity: Results from retinally disparate points, that is, retinal points which give rise to different principal visual directions and which, when within Panum's fusional area (zone of single binocular vision), can be fused, resulting in single vision. (A worked numerical example follows this list.)
  • 10. Binocular retinal rivalry: Alternating suppression of the two eyes resulting in alternating perception of the two retinal images.
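  • As a hedged numerical illustration of definitions 4, 7, and 9 (using standard vision-science values that are not taken from this disclosure): for a viewer with interocular distance $a$ fixating at distance $d$, a point $\Delta d$ nearer than the point of fixation produces a binocular retinal disparity of approximately

\[
\delta \;=\; a\left(\frac{1}{d-\Delta d}-\frac{1}{d}\right)\;\approx\;\frac{a\,\Delta d}{d^{2}} \qquad \text{(radians, for } \Delta d \ll d\text{)}.
\]

With $a = 63\ \mathrm{mm}$, $d = 2\ \mathrm{m}$, and an object placed $\Delta d = 5\ \mathrm{cm}$ in front of the image, $\delta \approx 0.063 \times 0.05 / 2^{2} \approx 7.9 \times 10^{-4}\ \mathrm{rad}$, roughly 2.7 arcminutes, which lies well inside the 15-20 arcminute extent commonly quoted for Panum's fusional area near the fovea, so fusion rather than diplopia is to be expected.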
  • The present invention and its underlying principle may be understood with reference to FIG. 1. The system and device for enhancement of depth perception includes a 2D image 1 which is displayed on an image surface 3, which may be part of an image object 6, and at least one visually identifiable object (VIO) 2 placed in front of the 2D image 1, so as to present those elements to the viewer 9 in what will hereinafter be referred to as a “combined scene” 13. The preferred VIO 2 should have a visually identifiable boundary 7 which is relatively well defined and sharp, rather than ill-defined and fuzzy. The VIO 2 may obscure a part 11 of the 2D image 1 to the viewer, surround an area which includes the 2D image 1, or surround all or part of the 2D image 1. The front of the 2D image 1 is the side of the 2D image 1 which faces the viewer 9. Such a position in front of the 2D image 1 is necessarily before and within the view 8 of the viewer 9 of the combined scene 13, whether the image surface 3 is horizontal, vertical, or at some angle with the horizontal or vertical. (For the purpose of this disclosure, “horizontal” shall mean the orientation in which the viewer's 9 eyes 10 are arranged, even though not horizontal in relation to the earth's surface.) The placement of the VIO 2 in front of the 2D image 1 should be substantially within the depth of field of the lenses of the viewer's 9 eyes 10 as they are focused on the 2D image 1, so that the VIO 2, and thus the combined scene 13, is also in focus for both eyes 10 at the same time. The placement of the VIO 2 should also be a sufficient distance from the 2D image 1 to be distinguishable by the viewer as nearer in space than the 2D image and/or induce two disparate images of the combined scene 13, one on the retina of each of the viewer's 9 eyes 10.
  • With the viewing of the system exemplified in FIG. 1 a fused cyclopean image of the combined scene 13 will be generated in the visual cortex of the viewer 9. Such a fused image of the combined scene 13 in the viewer 9 then has the quality of stereo vision, not simply distinguishing distance between the 2D image 1 and the VIO 2, but within the 2D image 1 itself, converting the available monocular depth cues therein to a binocularly fused image and the experience of true depth in the fused cyclopean image. Thus it seems that the human brain can reconstruct and appreciate stereo vision in a two-dimensional image by the stimulation to fusion in the visual cortex of a retinal disparity presented by the binocular depth cue, substantially in Panum's fusional region, of the combined scene.
  • The stereo vision provided by the present invention is superior to the prior art in that the cyclopean images produced by the prior art are highly artificial in appearance as earlier indicated. The visual experience provided by the present invention does not suffer those defects, but provides stereo vision which is natural to the viewer, because it is within the viewer's own capacity for stereopsis as derived from the monocular depth cues of the 2D image, and does so without the necessity for the special eye wear required by the prior art.
  • The VIO 2 may have any shape, but should have sufficient extent and be placed in front of the 2D image 1 so as to be visible and in substantial focus when any part of the 2D image 1 is in focus, that is, within the depth of field of the viewer 9 when viewing the combined scene 13. The placement of the VIO 2 may be fixed or adjustable, by any means, such as suspension in position or attachment to the image object 6. The VIO may be a grid (not shown in FIG. 1) between the viewer and the 2D image with wide enough spacing between visible grid elements to minimize interference with the viewing of the 2D image. However, because the VIO must be clearly present to the viewer as an integral part of the combined scene, the use of a grid as the VIO would inevitably interfere with the viewer's appreciation of the content of the 2D image. Thus the preferred VIO 2 is one that surrounds as much of the viewer's “area of attention” 5 in the 2D image 1 as possible, while minimally interfering with the viewer's 9 appreciation of the 2D image 1. The term “area of attention” 5 is defined here to mean the whole or part of a two-dimensional image that a viewer is focused upon and includes the viewer's 9 point of fixation at its center. The preferred VIO 2 is a frame, hereinafter referred to as a VIO/frame 2, which surrounds the two-dimensional image while obscuring its edges, and does not severely crop the viewer's 9 view of the 2D image. Such a VIO/frame 2 may have an adjustable aperture 12, the position of which may be shifted horizontally within the VIO/frame 2, in order to compensate for the viewer's 9 viewing position relative to the 2D image 1, the viewer's 9 angle of view, the shape of the image surface 3, and the distance of the viewer 9 from the 2D image 1. If the viewer's 9 area of attention 5 is less than the entire 2D image 1, the VIO/frame 2 may be limited to a sufficient size to substantially bound the area of attention 5 for the viewer at a particular distance from the 2D image, so that no well defined edges 4 of the 2D image 1 are available to the view 8 of the viewer 9.
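  • The horizontal shift of the adjustable aperture 12 described above is, on one plausible reading, simple similar-triangle geometry: the sight line from an off-axis viewer to the center of the area of attention 5 crosses the plane of the VIO/frame 2 at a lateral offset proportional to the frame-to-image separation. The sketch below is an assumption-based illustration of that reading, not a construction given in the disclosure.

```python
def aperture_shift_m(viewer_offset_m, viewer_distance_m, frame_separation_m):
    """Lateral shift of the frame aperture, toward the viewer's side, that
    keeps the area of attention centered in the aperture.

    viewer_offset_m    -- viewer's lateral distance from the image's center axis
    viewer_distance_m  -- viewer's distance from the image surface
    frame_separation_m -- distance of the VIO/frame in front of the image surface
    """
    # Similar triangles: the sight line from the viewer to the image center
    # crosses the frame plane at this fraction of the viewer's offset.
    return frame_separation_m * viewer_offset_m / viewer_distance_m

# A viewer standing 0.5 m left of center, 3 m from the image, with the
# frame 8 cm in front of it: shift the aperture about 1.3 cm leftward.
print(f"{aperture_shift_m(0.5, 3.0, 0.08) * 100:.1f} cm")
```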
  • The placement of the VIO 2 may be by any means, such as suspension in position or attachment to the image object 6, fixed or adjustable with respect to distance from and angle with the image surface 3. The shape of the VIO 2 may also be made to be adjustable between flat, or horizontally or vertically curved, or both. Similarly the shape of the image surface 3 may be flat, or horizontally or vertically curved, or both.
  • The VIO 2 may be opaque, translucent, or transparent with distortion. The VIO 2 should also obscure at least half of the opposing edges 4 of the 2D image, but obscuring all of the opposing edges 4 of the 2D image 1 is preferred. A pattern visually discernable by the viewer may be applied to the side of the VIO/frame which faces the viewer, and will have the greatest effect when placed on the most horizontal sides of a VIO/frame by intensifying the retinal disparity of the combined scene in the viewer's horizontally arranged eyes.
  • The enhancement effect of the system and device may be improved, depending on the 2D image 1, by illumination of the VIO 2 for the viewer from the front, rear, or internally, or where the VIO 2 is itself in whole or in part an illuminating device. The illumination of the VIO 2 may be of various colors and intensities, and may be polarized, in order to intensify the enhancement effect; and the color, intensity and polarization of the illumination of the VIO 2 may be variable over time for the same purpose. Such variability may be programably controlled, controlled with reference to the characteristics of the 2D image 1, such as brightness, coloration, resolution, shape, program material, monocular depth cues, etc., and/or controlled with reference to the characteristics of the viewer's 9 vision, to improve and/or maintain the enhancement effect.
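  • The disclosure leaves the control scheme above open (“programably controlled, controlled with reference to the characteristics of the 2D image 1”). One plausible realization is a feedback loop that samples the displayed image and holds the VIO 2 a fixed luminance step above it, so that its boundary 7 remains visually identifiable as the content changes. The sketch below is entirely hypothetical; the VioLamp interface and the 0.15 contrast offset are assumptions for illustration, not an API or parameter from the disclosure.

```python
from dataclasses import dataclass

@dataclass
class VioLamp:
    """Hypothetical stand-in for a controllable VIO light source."""
    intensity: float = 0.5  # normalized 0..1

    def set_intensity(self, level):
        self.intensity = max(0.0, min(1.0, level))

def update_vio_illumination(lamp, frame_luma, contrast_offset=0.15):
    """Hold the VIO a fixed luminance step above the displayed image so
    its boundary stays identifiable as the 2D image content changes."""
    mean_luma = sum(frame_luma) / len(frame_luma)  # mean image brightness, 0..1
    lamp.set_intensity(mean_luma + contrast_offset)

lamp = VioLamp()
update_vio_illumination(lamp, frame_luma=[0.20, 0.35, 0.30])  # a dark scene
print(f"VIO intensity -> {lamp.intensity:.2f}")  # 0.43
```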
  • The enhancement effect of the system and device may also be improved by illumination of the 2D image 1 itself, from the front, rear, or from within the image object 6, where the image object 6 is an illuminating device, such as a light box, video monitor or television set. Such illumination of the 2D image 1 may be from sources attached to or independent of the VIO 2, and may be controlled in a manner similar to the illumination of the VIO 2 for the improvement of the enhancement effect. All of the various attributes of the 2D image 1 and the VIO 2 may be combined and controlled, as well as the position of the viewer, to accommodate the vision characteristics for a particular viewer to improve and/or maintain the enhancement effect, and the entire range of such combination and control is included in the invention.
  • The method underlying the system and device has been referred to throughout the disclosure of the system and device, the principal element of which is the induction of retinal disparity in the eyes 10 of the viewer 9 from binocular retinal images of a combined scene 13, created by the addition of a VIO 2 to the scene of the 2D image 1. Upon fusion in the visual cortex of the viewer 9 of the retinally disparate images of the monocular depth cues of the 2D image 1 on Panum's fusional area, the monocular depth cues are interpreted as binocular depth cues with those of the VIO 2, and thus transformed into binocular depth cues with the entirety of the combined scene 13, so that the viewer experiences the enhancement of his or her depth perception in the 2D image 1 as a form of stereo vision.
  • Although the invention is designed to be effective to enhance depth perception in a 2D image 1 for a viewer who has two eyes and a relatively normal ocular and neural capacity for stereopsis, such effectiveness varying with the level of such capacity in the viewer, the enhancement effect may be available to a viewer who has vision in only one eye where the binocular depth cue may be simulated to appropriately stimulate the viewer's visual cortex. Testing has shown that a viewer 9 in horizontal motion relative to the combined scene 13, using the system and device with only one eye, experiences the enhancement effect through the experience of motion parallax between the VIO 2 and the 2D image 1. It thus appears that simulation of such motion parallax between the VIO 2 and the 2D image 1, such as by moving all or part of the VIO 2 horizontally relative to the 2D image 1, or by moving the 2D image 1 horizontally relative to the VIO 2, will contribute to the enhancement effect in a viewer 9 with only one eye, and is included in the system. Such motion parallax may also contribute to the enhancement effect for a binocular viewer 9.
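  • The simulated motion parallax described above, moving all or part of the VIO 2 horizontally relative to the 2D image 1, could be driven by a profile as simple as a slow sinusoidal offset. The sketch below generates such a profile; the 0.5 Hz rate and 4 mm amplitude are arbitrary illustrative choices, not values from the disclosure.

```python
import math

def vio_offset_m(t_s, amplitude_m=0.004, freq_hz=0.5):
    """Horizontal VIO offset at time t_s: a slow side-to-side sweep that
    simulates the parallax a laterally moving viewer would see between
    the VIO and the 2D image behind it."""
    return amplitude_m * math.sin(2.0 * math.pi * freq_hz * t_s)

# Sample one second of the motion profile at 10 Hz.
for i in range(11):
    t = i / 10.0
    print(f"t={t:.1f}s  offset={vio_offset_m(t) * 1000:+.1f} mm")
```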
  • While the invention has been disclosed in connection with the example of a preferred embodiment, it will be understood that there is no intention to limit the invention to the particular embodiment shown, and that this disclosure is intended to cover the general application of the method and the various alternative and equivalent constructions included within the spirit and scope of the appended claims.

Claims (36)

1. A system and device for enhancement of depth perception in a viewer of a two-dimensional image comprising:
a) a two-dimensional image with one or more monocular depth cues displayed on an image surface; and
b) at least one visually identifiable object placed in front of said image surface to create a combined scene with said two-dimensional image, said visually identifiable object being:
1) substantially within the depth of field of focus of said viewer's eyes when said two-dimensional image is in focus for said viewer,
2) without substantially interfering with the view of the viewer of said two-dimensional image, and
3) a sufficient distance from the two-dimensional image to induce disparate binocular retinal images in the eyes of the viewer;
having the effect of enhancement of the viewer's depth perception in viewing said two-dimensional image by and/or with the fusion of said binocular images of the combined scene in the viewer.
2. The system and device for enhancement of depth perception of claim 1 wherein said at least one visually identifiable object is substantially within the Panum's fusional region for said two-dimensional image.
3. A system and device for enhancement of depth perception in a viewer of a two-dimensional image comprising:
a) a two-dimensional image displayed on an image surface; and
b) at least one visually identifiable object positioned in front of the image surface, said at least one visually identifiable object being:
1) substantially in focus for said viewer with said two-dimensional image, and
2) at a sufficient distance from said image surface to induce disparate retinal images in the eyes of the viewer;
having the effect of enhancement of the viewer's depth perception in viewing said two-dimensional image by and/or with the fusion of said disparate retinal images in the viewer.
4. The system and device for enhancement of depth perception of claim 3 wherein said two-dimensional image includes one or more monocular depth cues.
5. The system and device for enhancement of depth perception of claim 3 wherein said image surface is curved, either horizontally, vertically, or horizontally and vertically.
6. The system and device for enhancement of depth perception of claim 3 wherein said at least one visually identifiable object is a frame.
7. The system and device for enhancement of depth perception of claim 3 wherein said two-dimensional image is in horizontal motion within the viewer's view relative to the visually identifiable object, such horizontal motion being controlled with reference to the brightness, color, and/or monocular depth cues of said two-dimensional image, or otherwise programably controlled to intensify the enhancement effect and/or accommodate said viewer.
8. The system and device for enhancement of depth perception of claim 3 wherein said visually identifiable object is in horizontal motion within said viewer's view relative to the image surface, such horizontal motion being controlled with reference to the brightness, color, and/or monocular depth cues of said two-dimensional image, or otherwise programably controlled to intensify the enhancement effect and/or accommodate said viewer.
9. The system and device for enhancement of depth perception of claim 3 wherein said at least one visually identifiable object is substantially within the Panum's fusional region for said two-dimensional image.
10. The system and device for enhancement of depth perception of claim 3 wherein said at least one visually identifiable object is illuminated on the side which faces said viewer.
11. The system and device for enhancement of depth perception of claim 10 wherein the source of illumination is attached to or within said at least one visually identifiable object.
12. The system and device for enhancement of depth perception of claim 11 wherein the illumination of said at least one visually identifiable object is variable in intensity, color and/or polarization, such variability being controlled with reference to the brightness, color, and/or monocular depth cues of said two-dimensional image, or otherwise programably controlled to intensify the enhancement effect and/or accommodate said viewer.
13. A system and device for enhancement of depth perception in a viewer of a two-dimensional image comprising:
a) a two-dimensional image with one or more monocular depth cues, the two-dimensional image being displayed on an image surface; and
b) at least one visually identifiable object positioned in front of the image surface, said at least one visually identifiable object being:
1) at a sufficient distance from said image surface to induce disparate retinal images in the eyes of the viewer, and
2) substantially in focus for said viewer with said two-dimensional image and/or substantially within the Panum's fusional region for said two-dimensional image;
having the effect of enhancement of the viewer's depth perception in viewing said two-dimensional image by and/or with the fusion of said disparate retinal images in the viewer.
14. The system and device for enhancement of depth perception of claim 13 wherein said image surface is curved, either horizontally, vertically, or horizontally and vertically.
15. The system and device for enhancement of depth perception of claim 13 wherein said two-dimensional image is in horizontal motion within the viewer's view relative to the visually identifiable object, such horizontal motion being controlled with reference to the brightness, color, and/or monocular depth cues of said two-dimensional image, or otherwise programably controlled to intensify the enhancement effect and/or accommodate said viewer.
16. The system and device for enhancement of depth perception of claim 13 wherein said visually identifiable object is in horizontal motion within said viewer's view relative to the image surface, such horizontal motion being controlled with reference to the brightness, color, and/or monocular depth cues of said two-dimensional image, or otherwise programably controlled to intensify the enhancement effect and/or accommodate said viewer.
17. The system and device for enhancement of depth perception of claim 13 wherein said at least one visually identifiable object is illuminated on the side which faces said viewer.
18. The system and device for enhancement of depth perception of claim 17 wherein the illumination of said at least one visually identifiable object is variable in intensity, color and/or polarization, such variability being controlled with reference to the brightness, color, and/or monocular depth cues of said two-dimensional image, or otherwise programably controlled to intensify the enhancement effect and/or accommodate said viewer.
19. The system and device for enhancement of depth perception of claim 13 wherein said at least one visually identifiable object is a frame.
20. The system and device for enhancement of depth perception of claim 19 wherein the horizontal size of said aperture of said frame is variable within the viewer's view relative to the two-dimensional image, such variability being controlled with reference to the brightness, color, and/or monocular depth cues of said two-dimensional image, or otherwise programably controlled to intensify the enhancement effect and/or accommodate said viewer.
21. The system and device for enhancement of depth perception of claim 19 wherein the shape of said frame and/or the size of the aperture of said frame is variable, such variability being controlled with reference to the brightness, color, and/or monocular depth cues of said two-dimensional image, or otherwise programably controlled to intensify the enhancement effect and/or accommodate said viewer.
22. The system and device for enhancement of depth perception of claim 19 wherein the aperture of said frame has the same shape as said two-dimensional image, and is of sufficient size so that substantially all of said two-dimensional image appears to said viewer to be within said aperture.
23. The system and device for enhancement of depth perception of claim 13 wherein at least half of the opposing edges of said two-dimensional image are obscured from the view of said viewer of said two-dimensional image.
24. The system and device for enhancement of depth perception of claim 13 wherein the opposing vertical edges of said two-dimensional image are obscured from the view of said viewer of said two-dimensional image.
25. The system and device for enhancement of depth perception of claim 24 wherein said image surface is curved, either horizontally, vertically, or horizontally and vertically.
26. The system and device for enhancement of depth perception of claim 13 wherein said at least one visually identifiable object is illuminated, and the source of illumination is attached to or within said at least one visually identifiable object.
27. The system and device for enhancement of depth perception of claim 13 wherein said at least one visually identifiable object is recognizable as being a particular thing distinguishable by said viewer as nearer in space to said viewer than said two-dimensional image.
28. The system and device for enhancement of depth perception of claim 13 wherein said at least one visually identifiable object is a partial frame further comprised of two vertical sides.
29. The system and device for enhancement of depth perception of claim 13 wherein said two-dimensional image is illuminated.
30. The system and device for enhancement of depth perception of claim 29 wherein the illumination of said two-dimensional image is from a source within or attached to said visually identifiable object, so that said viewer is shielded from such illumination.
31. The system and device for enhancement of depth perception of claim 29 wherein the illumination of said two-dimensional image is variable in intensity, color and/or polarization, such variability being controlled with reference to the brightness, color, and/or monocular depth cues of said two-dimensional image, or otherwise programably controlled to intensify the enhancement effect and/or accommodate said viewer.
32. A method for enhancement of depth perception in a viewer of a two-dimensional image comprising:
a) selection of a two-dimensional image with one or more monocular depth cues;
b) induction of retinal disparity in the viewer from binocular retinal images of a combined scene created by the addition of a visually identifiable object to the scene of the two-dimensional image substantially within the depth of field of focus of said viewer's eyes when said two-dimensional image is in focus for said viewer; and
c) fusion of such retinally disparate images in the visual cortex of the viewer with the interpretation of said one or more monocular depth cues as binocular depth cues with those of the visually identifiable object in the combined scene.
33. A method for enhancement of depth perception in a viewer of a two-dimensional image comprising:
a) selection of a substantially two-dimensional image with at least one monocular depth cue;
b) selection of one or more visually identifiable objects for viewing with the two-dimensional image;
c) formation of a combined scene by placement of at least one visually identifiable object in front of said two-dimensional image and within the view of the viewer, which is:
1) substantially within the depth of field of focus of said viewer's eyes when said two-dimensional image is in focus for said viewer;
2) at a sufficient distance from said two-dimensional image so as to be clearly distinguishable by said viewer as nearer in space to said viewer than said two-dimensional image;
3) without interfering with the view of said viewer of said two-dimensional image;
d) obscuring at least two opposing edges of said two dimensional image;
e) induction in the viewer of a sufficient retinal disparity in said viewer by the visual presentation to said viewer of the combined scene; and
f) effecting the enhancement of said viewer's depth perception in said two-dimensional image by and/or with the viewer's fusion of the disparate retinal images of the combined scene.
34. A method for enhancement of depth perception in a viewer of a two-dimensional image comprising the induction of disparate retinal images in the viewer of the two-dimensional image with the formation of a combined scene of said two-dimensional image and at least one visually identifiable object placed in front of said two-dimensional image and before said viewer, said at least one visually identifiable object being:
a) substantially within the depth of field of focus of said viewer's eyes when said two-dimensional image is in focus for said viewer;
b) at a sufficient distance from said two-dimensional image so as to be distinguishable by the viewer as nearer in space than said two-dimensional image;
c) without interfering with the view of the viewer of said two-dimensional image;
effecting the enhancement of said viewer's depth perception in said two-dimensional image by and/or with the viewer's fusion of said disparate retinal images.
35. A method for enhancement of depth perception in a viewer of a two-dimensional image comprising:
a) induction of disparate retinal images in the viewer of a two-dimensional image by the placement of a visually identifiable object in front of the two-dimensional image within the depth of field of focus of said viewer's eyes that includes said two-dimensional image; and
b) effecting the enhancement of said viewer's depth perception in said two-dimensional image by and/or with the viewer's fusion of said disparate retinal images.
36. A method for enhancement of depth perception of claim 35 wherein:
a) the placement of the visually identifiable object is within the Panum's fusional region for said two-dimensional image and a sufficient distance from said two-dimensional image so as to be distinguishable by the viewer as nearer in space than said two-dimensional image;
b) the two dimensional image has one or more monocular depth cues; and
c) at least two of the opposing edges of said two-dimensional image are obscured to the view of the viewer.
US11/033,186 2005-01-11 2005-01-11 Enhancement of depth perception Active US7073908B1 (en)

Priority Applications (12)

Application Number Priority Date Filing Date Title
US11/033,186 US7073908B1 (en) 2005-01-11 2005-01-11 Enhancement of depth perception
PCT/US2005/047577 WO2006076173A2 (en) 2005-01-11 2005-12-31 Enhancement of visual perception
MX2007008424A MX2007008424A (en) 2005-01-11 2005-12-31 Enhancement of visual perception.
CNA2005800464005A CN101133360A (en) 2005-01-11 2005-12-31 Enhancement of visual perception
JP2007551283A JP2008527918A (en) 2005-01-11 2005-12-31 Enhance visual perception
CA002593243A CA2593243A1 (en) 2005-01-11 2005-12-31 Enhancement of visual perception
EA200701457A EA013779B1 (en) 2005-01-11 2005-12-31 Enhancement of visual perception
EP05856050A EP1836531A4 (en) 2005-01-11 2005-12-31 Enhancement of visual perception
BRPI0518498-3A BRPI0518498A2 (en) 2005-01-11 2005-12-31 increased visual perception
AU2005324379A AU2005324379A1 (en) 2005-01-11 2005-12-31 Enhancement of visual perception
ZA200705671A ZA200705671B (en) 2005-01-11 2007-07-10 Enhancement of visual perception
CL200702009A CL2007002009A1 (en) 2005-01-11 2007-07-10 SYSTEM AND METHOD TO IMPROVE THE PERCEPTION OF A TWO-DIMENSIONAL IMAGE THAT PROVIDES THE INDUCTION OF RETINAL DISPARITY IN THE VIEWER AND WITH FUSION OF RETINALLY DISPARATE IMAGES.

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US11/033,186 US7073908B1 (en) 2005-01-11 2005-01-11 Enhancement of depth perception

Publications (2)

Publication Number Publication Date
US7073908B1 (en) 2006-07-11
US20060152803A1 (en) 2006-07-13

Family

ID=36643987

Family Applications (1)

Application Number Title Priority Date Filing Date
US11/033,186 Active US7073908B1 (en) 2005-01-11 2005-01-11 Enhancement of depth perception

Country Status (4)

Country Link
US (1) US7073908B1 (en)
CN (1) CN101133360A (en)
CL (1) CL2007002009A1 (en)
ZA (1) ZA200705671B (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7604348B2 (en) * 2001-01-23 2009-10-20 Kenneth Martin Jacobs Continuous adjustable 3deeps filter spectacles for optimized 3deeps stereoscopic viewing and its control method and means
US7545405B2 (en) * 2006-05-12 2009-06-09 Anthony Italo Provitola Enhancement of visual perception II
US7612795B2 (en) * 2006-05-12 2009-11-03 Anthony Italo Provitola Enhancement of visual perception III
US8531507B2 (en) * 2007-10-16 2013-09-10 Anthony Italo Provitola Enhancement of visual perception IV
WO2010010331A1 (en) * 2008-07-23 2010-01-28 Atelier Vision Limited The compositional structure, mechanisms and processes for the inclusion of binocular stereo information into representational media
JP5323413B2 (en) * 2008-07-25 2013-10-23 シャープ株式会社 Additional data generation system
TW201506450A (en) * 2013-08-01 2015-02-16 Bandai Co Image display device and packaging container
US10547831B2 (en) 2013-11-27 2020-01-28 Semiconductor Energy Laboratory Co., Ltd. Display device and display device frame
ES2575211B1 (en) * 2014-11-25 2017-02-23 Davalor Salud, S.L. METHOD OF REPRODUCTION OF IMAGES WITH THREE-DIMENSIONAL APPEARANCE
WO2022267374A1 (en) * 2021-06-24 2022-12-29 深圳市立体通技术有限公司 Autostereoscopic display assembly and autostereoscopic display apparatus
CN113655626B (en) * 2021-07-23 2023-06-13 深圳市立体通技术有限公司 Naked-eye three-dimensional display assembly and naked-eye three-dimensional display device

Patent Citations (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US2679188A (en) * 1949-08-30 1954-05-25 Gould Leigh Depth illusion attachment device for optical projectors
US2943964A (en) * 1958-06-17 1960-07-05 Goldenberg Max Television viewing screen
US3582961A (en) * 1967-05-01 1971-06-01 Chushiro Shindo System for displaying a two-dimensional photographic picture in three dimensions
US3701581A (en) * 1971-11-15 1972-10-31 Gen Electric Stereoscopic enhancement of pictorial displays
US4517558A (en) * 1982-05-03 1985-05-14 International Game Technology Three dimensional video screen display effect
US4633322A (en) * 1984-02-20 1986-12-30 Fourny Denise G Screen to be disposed in front of a cathode ray screen, comprised by monofilaments forming micromeshes and having, on one surface, a translucent film
US4819085A (en) * 1986-06-09 1989-04-04 Liang Paul M Screen for cathode ray tubes
US5172266A (en) * 1989-09-19 1992-12-15 Texas Instruments Incorporated Real time three dimensional display
US5291330A (en) * 1989-11-03 1994-03-01 Joseph Daniels Method and apparatus for enhancing image resolution by means of a multiplicity of phase objects and an optional external radiant flux
US5541642A (en) * 1990-12-21 1996-07-30 Delta Systems Design Ltd. Stereoscopic imaging systems
US5751927A (en) * 1991-03-26 1998-05-12 Wason; Thomas D. Method and apparatus for producing three dimensional displays on a two dimensional surface
US5556184A (en) * 1991-10-09 1996-09-17 Nader-Esfahani; Rahim Imaginograph
US5257130A (en) * 1992-01-30 1993-10-26 The Walt Disney Company Apparatus and method for creating a real image illusion
US5510832A (en) * 1993-12-01 1996-04-23 Medi-Vision Technologies, Inc. Synthesized stereoscopic imaging system and method
US5806218A (en) * 1994-03-31 1998-09-15 Central Research Laboratories Limited Border for an image
US5488510A (en) * 1994-07-26 1996-01-30 Lemay; Edward J. Enhanced depth perception viewing device for television
US6414681B1 (en) * 1994-10-12 2002-07-02 Canon Kabushiki Kaisha Method and apparatus for stereo image display
US5886771A (en) * 1997-12-03 1999-03-23 Evergreen Innovations, L.L.C. Polarizing system for motion visual depth effects
US6530662B1 (en) * 2000-09-19 2003-03-11 Disney Enterprises, Inc. System and method for enhancing the realism of a displayed image
US6536146B2 (en) * 2001-05-25 2003-03-25 Steven Ericson Movement effect visual display
US20050206582A1 (en) * 2001-11-09 2005-09-22 Bell Gareth P Depth fused display
US6742892B2 (en) * 2002-04-16 2004-06-01 Exercise Your Eyes, Llc Device and method for exercising eyes
US6929369B2 (en) * 2002-09-17 2005-08-16 Sharp Kabushiki Kaisha Autostereoscopic display
US20050052617A1 (en) * 2003-08-22 2005-03-10 Denso Corporation Virtual image display apparatus

Cited By (253)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9665384B2 (en) 2005-08-30 2017-05-30 Microsoft Technology Licensing, Llc Aggregation of computing device settings
US8340422B2 (en) 2006-11-21 2012-12-25 Koninklijke Philips Electronics N.V. Generation of depth map for an image
US20100046837A1 (en) * 2006-11-21 2010-02-25 Koninklijke Philips Electronics N.V. Generation of depth map for an image
US20090244072A1 (en) * 2008-03-28 2009-10-01 Vladimir Pugach Method for correct reproduction of moving spatial images on a flat screen
US8106910B2 (en) * 2008-03-28 2012-01-31 Vladimir Pugach Method for correct reproduction of moving spatial images on a flat screen
US9060120B2 (en) 2008-05-20 2015-06-16 Pelican Imaging Corporation Systems and methods for generating depth maps using images captured by camera arrays
US9055233B2 (en) 2008-05-20 2015-06-09 Pelican Imaging Corporation Systems and methods for synthesizing higher resolution images using a set of images containing a baseline image
US9094661B2 (en) 2008-05-20 2015-07-28 Pelican Imaging Corporation Systems and methods for generating depth maps using a set of images containing a baseline image
US9077893B2 (en) 2008-05-20 2015-07-07 Pelican Imaging Corporation Capturing and processing of images captured by non-grid camera arrays
US9576369B2 (en) 2008-05-20 2017-02-21 Fotonation Cayman Limited Systems and methods for generating depth maps using images captured by camera arrays incorporating cameras having different fields of view
US9191580B2 (en) 2008-05-20 2015-11-17 Pelican Imaging Corporation Capturing and processing of images including occlusions captured by camera arrays
US9060124B2 (en) 2008-05-20 2015-06-16 Pelican Imaging Corporation Capturing and processing of images using non-monolithic camera arrays
US9060121B2 (en) 2008-05-20 2015-06-16 Pelican Imaging Corporation Capturing and processing of images captured by camera arrays including cameras dedicated to sampling luma and cameras dedicated to sampling chroma
US9060142B2 (en) 2008-05-20 2015-06-16 Pelican Imaging Corporation Capturing and processing of images captured by camera arrays including heterogeneous optics
US9235898B2 (en) 2008-05-20 2016-01-12 Pelican Imaging Corporation Systems and methods for generating depth maps using light focused on an image sensor by a lens element array
US9055213B2 (en) 2008-05-20 2015-06-09 Pelican Imaging Corporation Systems and methods for measuring depth using images captured by monolithic camera arrays including at least one bayer camera
US9188765B2 (en) 2008-05-20 2015-11-17 Pelican Imaging Corporation Capturing and processing of images including occlusions focused on an image sensor by a lens stack array
US9485496B2 (en) 2008-05-20 2016-11-01 Pelican Imaging Corporation Systems and methods for measuring depth using images captured by a camera array including cameras surrounding a central camera
US9049367B2 (en) 2008-05-20 2015-06-02 Pelican Imaging Corporation Systems and methods for synthesizing higher resolution images using images captured by camera arrays
US10142560B2 (en) 2008-05-20 2018-11-27 Fotonation Limited Capturing and processing of images including occlusions focused on an image sensor by a lens stack array
US9049381B2 (en) 2008-05-20 2015-06-02 Pelican Imaging Corporation Systems and methods for normalizing image data captured by camera arrays
US9049391B2 (en) 2008-05-20 2015-06-02 Pelican Imaging Corporation Capturing and processing of near-IR images including occlusions using camera arrays incorporating near-IR light sources
US9049411B2 (en) 2008-05-20 2015-06-02 Pelican Imaging Corporation Camera arrays incorporating 3×3 imager configurations
US9041823B2 (en) 2008-05-20 2015-05-26 Pelican Imaging Corporation Systems and methods for performing post capture refocus using images captured by camera arrays
US11412158B2 (en) 2008-05-20 2022-08-09 Fotonation Limited Capturing and processing of images including occlusions focused on an image sensor by a lens stack array
US11792538B2 (en) 2008-05-20 2023-10-17 Adeia Imaging Llc Capturing and processing of images including occlusions focused on an image sensor by a lens stack array
US9712759B2 (en) 2008-05-20 2017-07-18 Fotonation Cayman Limited Systems and methods for generating depth maps using camera arrays incorporating monochrome and color cameras
US10027901B2 (en) 2008-05-20 2018-07-17 Fotonation Cayman Limited Systems and methods for generating depth maps using camera arrays incorporating monochrome and color cameras
US9749547B2 (en) 2008-05-20 2017-08-29 Fotonation Cayman Limited Capturing and processing of images using a camera array incorporating Bayer cameras having different fields of view
US9606704B2 (en) 2008-10-23 2017-03-28 Microsoft Technology Licensing, Llc Alternative inputs of a mobile communications device
US8385952B2 (en) 2008-10-23 2013-02-26 Microsoft Corporation Mobile communications device user interface
US20100105424A1 (en) * 2008-10-23 2010-04-29 Smuga Michael A Mobile Communications Device User Interface
US8086275B2 (en) 2008-10-23 2011-12-27 Microsoft Corporation Alternative inputs of a mobile communications device
US8250494B2 (en) 2008-10-23 2012-08-21 Microsoft Corporation User interface with parallax animation
US9323424B2 (en) 2008-10-23 2016-04-26 Microsoft Corporation Column organization of content
US8970499B2 (en) 2008-10-23 2015-03-03 Microsoft Technology Licensing, Llc Alternative inputs of a mobile communications device
US8411046B2 (en) 2008-10-23 2013-04-02 Microsoft Corporation Column organization of content
US10133453B2 (en) 2008-10-23 2018-11-20 Microsoft Technology Licensing, Llc Alternative inputs of a mobile communications device
US8634876B2 (en) 2008-10-23 2014-01-21 Microsoft Corporation Location based display characteristics in a user interface
US8781533B2 (en) 2008-10-23 2014-07-15 Microsoft Corporation Alternative inputs of a mobile communications device
US8825699B2 (en) 2008-10-23 2014-09-02 Rovi Corporation Contextual search by a mobile communications device
US9223412B2 (en) 2008-10-23 2015-12-29 Rovi Technologies Corporation Location-based display characteristics in a user interface
US9223411B2 (en) 2008-10-23 2015-12-29 Microsoft Technology Licensing, Llc User interface with parallax animation
US9218067B2 (en) 2008-10-23 2015-12-22 Microsoft Technology Licensing, Llc Mobile communications device user interface
US9703452B2 (en) 2008-10-23 2017-07-11 Microsoft Technology Licensing, Llc Mobile communications device user interface
US9977575B2 (en) 2009-03-30 2018-05-22 Microsoft Technology Licensing, Llc Chromeless user interface
US8914072B2 (en) 2009-03-30 2014-12-16 Microsoft Corporation Chromeless user interface
US8175653B2 (en) 2009-03-30 2012-05-08 Microsoft Corporation Chromeless user interface
US8892170B2 (en) 2009-03-30 2014-11-18 Microsoft Corporation Unlock screen
US8238876B2 (en) 2009-03-30 2012-08-07 Microsoft Corporation Notifications
US8548431B2 (en) 2009-03-30 2013-10-01 Microsoft Corporation Notifications
US8355698B2 (en) 2009-03-30 2013-01-15 Microsoft Corporation Unlock screen
US8269736B2 (en) 2009-05-22 2012-09-18 Microsoft Corporation Drop target gestures
US8836648B2 (en) 2009-05-27 2014-09-16 Microsoft Corporation Touch pull-in gesture
US9264610B2 (en) 2009-11-20 2016-02-16 Pelican Imaging Corporation Capturing and processing of images including occlusions captured by heterogeneous camera arrays
US10306120B2 (en) 2009-11-20 2019-05-28 Fotonation Limited Capturing and processing of images captured by camera arrays incorporating cameras with telephoto and conventional lenses to generate depth maps
US9936148B2 (en) 2010-05-12 2018-04-03 Fotonation Cayman Limited Imager array interfaces
US10455168B2 (en) 2010-05-12 2019-10-22 Fotonation Limited Imager array interfaces
US11875475B2 (en) 2010-12-14 2024-01-16 Adeia Imaging Llc Systems and methods for synthesizing high resolution images using images captured by an array of independently controllable imagers
US10366472B2 (en) 2010-12-14 2019-07-30 Fotonation Limited Systems and methods for synthesizing high resolution images using images captured by an array of independently controllable imagers
US11423513B2 (en) 2010-12-14 2022-08-23 Fotonation Limited Systems and methods for synthesizing high resolution images using images captured by an array of independently controllable imagers
US8990733B2 (en) 2010-12-20 2015-03-24 Microsoft Technology Licensing, Llc Application-launching interface for multiple modes
US9430130B2 (en) 2010-12-20 2016-08-30 Microsoft Technology Licensing, Llc Customization of an immersive environment
US9696888B2 (en) 2010-12-20 2017-07-04 Microsoft Technology Licensing, Llc Application-launching interface for multiple modes
US9111326B1 (en) 2010-12-21 2015-08-18 Rawles Llc Designation of zones of interest within an augmented reality environment
US8905551B1 (en) 2010-12-23 2014-12-09 Rawles Llc Unpowered augmented reality projection accessory display device
US9864494B2 (en) 2010-12-23 2018-01-09 Microsoft Technology Licensing, Llc Application reporting in an application-selectable user interface
US8560959B2 (en) 2010-12-23 2013-10-15 Microsoft Corporation Presenting an application change through a tile
US8612874B2 (en) 2010-12-23 2013-12-17 Microsoft Corporation Presenting an application change through a tile
US9134593B1 (en) 2010-12-23 2015-09-15 Amazon Technologies, Inc. Generation and modulation of non-visible structured light for augmented reality projection system
US8689123B2 (en) 2010-12-23 2014-04-01 Microsoft Corporation Application reporting in an application-selectable user interface
US8845110B1 (en) 2010-12-23 2014-09-30 Rawles Llc Powered augmented reality projection accessory display device
US8845107B1 (en) * 2010-12-23 2014-09-30 Rawles Llc Characterization of a scene with structured light
US9766057B1 (en) 2010-12-23 2017-09-19 Amazon Technologies, Inc. Characterization of a scene with structured light
US9383831B1 (en) 2010-12-23 2016-07-05 Amazon Technologies, Inc. Powered augmented reality projection accessory display device
US9766790B2 (en) 2010-12-23 2017-09-19 Microsoft Technology Licensing, Llc Application reporting in an application-selectable user interface
US9870132B2 (en) 2010-12-23 2018-01-16 Microsoft Technology Licensing, Llc Application reporting in an application-selectable user interface
US9015606B2 (en) 2010-12-23 2015-04-21 Microsoft Technology Licensing, Llc Presenting an application change through a tile
US9213468B2 (en) 2010-12-23 2015-12-15 Microsoft Technology Licensing, Llc Application reporting in an application-selectable user interface
US9236000B1 (en) 2010-12-23 2016-01-12 Amazon Technologies, Inc. Unpowered augmented reality projection accessory display device
US10969944B2 (en) 2010-12-23 2021-04-06 Microsoft Technology Licensing, Llc Application reporting in an application-selectable user interface
US11126333B2 (en) 2010-12-23 2021-09-21 Microsoft Technology Licensing, Llc Application reporting in an application-selectable user interface
US9229918B2 (en) 2010-12-23 2016-01-05 Microsoft Technology Licensing, Llc Presenting an application change through a tile
US10031335B1 (en) 2010-12-23 2018-07-24 Amazon Technologies, Inc. Unpowered augmented reality projection accessory display device
US9721386B1 (en) 2010-12-27 2017-08-01 Amazon Technologies, Inc. Integrated augmented reality environment
US9607315B1 (en) 2010-12-30 2017-03-28 Amazon Technologies, Inc. Complementing operation of display devices in an augmented reality environment
US9508194B1 (en) 2010-12-30 2016-11-29 Amazon Technologies, Inc. Utilizing content output devices in an augmented reality environment
US9423951B2 (en) 2010-12-31 2016-08-23 Microsoft Technology Licensing, Llc Content-based snap point
FR2972817A1 (en) * 2011-03-16 2012-09-21 Christophe Lanfranchi Device for perceiving stereoscopic images, e.g. three-dimensional lenticular images, in stereoscopic TV, having a mask at a specific distance from the stereoscopic images that masks the circumference of the images for the observer
US9383917B2 (en) 2011-03-28 2016-07-05 Microsoft Technology Licensing, Llc Predictive tiling
US10218889B2 (en) 2011-05-11 2019-02-26 Fotonation Limited Systems and methods for transmitting and receiving array camera image data
US9866739B2 (en) 2011-05-11 2018-01-09 Fotonation Cayman Limited Systems and methods for transmitting and receiving array camera image data
US10742861B2 (en) 2011-05-11 2020-08-11 Fotonation Limited Systems and methods for transmitting and receiving array camera image data
US9104440B2 (en) 2011-05-27 2015-08-11 Microsoft Technology Licensing, Llc Multi-application environment
US10303325B2 (en) 2011-05-27 2019-05-28 Microsoft Technology Licensing, Llc Multi-application environment
US9329774B2 (en) 2011-05-27 2016-05-03 Microsoft Technology Licensing, Llc Switching back to a previously-interacted-with application
US9104307B2 (en) 2011-05-27 2015-08-11 Microsoft Technology Licensing, Llc Multi-application environment
US11272017B2 (en) 2011-05-27 2022-03-08 Microsoft Technology Licensing, Llc Application notifications manifest
US9158445B2 (en) 2011-05-27 2015-10-13 Microsoft Technology Licensing, Llc Managing an immersive interface in a multi-application immersive environment
US8893033B2 (en) 2011-05-27 2014-11-18 Microsoft Corporation Application notifications
US9535597B2 (en) 2011-05-27 2017-01-03 Microsoft Technology Licensing, Llc Managing an immersive interface in a multi-application immersive environment
US9052820B2 (en) 2011-05-27 2015-06-09 Microsoft Technology Licensing, Llc Multi-application environment
US11698721B2 (en) 2011-05-27 2023-07-11 Microsoft Technology Licensing, Llc Managing an immersive interface in a multi-application immersive environment
US9658766B2 (en) 2011-05-27 2017-05-23 Microsoft Technology Licensing, Llc Edge gesture
US9516222B2 (en) 2011-06-28 2016-12-06 Kip Peli P1 Lp Array cameras incorporating monolithic array camera modules with high MTF lens stacks for capture of images used in super-resolution processing
US9578237B2 (en) 2011-06-28 2017-02-21 Fotonation Cayman Limited Array cameras incorporating optics with modulation transfer functions greater than sensor Nyquist frequency for capture of images used in super-resolution processing
US8687023B2 (en) 2011-08-02 2014-04-01 Microsoft Corporation Cross-slide gesture to select and rearrange
US10579250B2 (en) 2011-09-01 2020-03-03 Microsoft Technology Licensing, Llc Arranging tiles
US8935631B2 (en) 2011-09-01 2015-01-13 Microsoft Corporation Arranging tiles
US9557909B2 (en) 2011-09-09 2017-01-31 Microsoft Technology Licensing, Llc Semantic zoom linguistic helpers
US10353566B2 (en) 2011-09-09 2019-07-16 Microsoft Technology Licensing, Llc Semantic zoom animations
US8922575B2 (en) 2011-09-09 2014-12-30 Microsoft Corporation Tile cache
US10114865B2 (en) 2011-09-09 2018-10-30 Microsoft Technology Licensing, Llc Tile cache
US10254955B2 (en) 2011-09-10 2019-04-09 Microsoft Technology Licensing, Llc Progressively indicating new content in an application-selectable user interface
US8830270B2 (en) 2011-09-10 2014-09-09 Microsoft Corporation Progressively indicating new content in an application-selectable user interface
US9146670B2 (en) 2011-09-10 2015-09-29 Microsoft Technology Licensing, Llc Progressively indicating new content in an application-selectable user interface
US9244802B2 (en) 2011-09-10 2016-01-26 Microsoft Technology Licensing, Llc Resource user interface
US8933952B2 (en) 2011-09-10 2015-01-13 Microsoft Corporation Pre-rendering new content for an application-selectable user interface
US9118782B1 (en) 2011-09-19 2015-08-25 Amazon Technologies, Inc. Optical interference mitigation
US10375302B2 (en) 2011-09-19 2019-08-06 Fotonation Limited Systems and methods for controlling aliasing in images captured by an array camera for use in super resolution processing using pixel apertures
US9794476B2 (en) 2011-09-19 2017-10-17 Fotonation Cayman Limited Systems and methods for controlling aliasing in images captured by an array camera for use in super resolution processing using pixel apertures
US9025895B2 (en) 2011-09-28 2015-05-05 Pelican Imaging Corporation Systems and methods for decoding refocusable light field image files
US9864921B2 (en) 2011-09-28 2018-01-09 Fotonation Cayman Limited Systems and methods for encoding image files containing depth maps stored as metadata
US20180197035A1 (en) 2011-09-28 2018-07-12 Fotonation Cayman Limited Systems and Methods for Encoding Image Files Containing Depth Maps Stored as Metadata
US9042667B2 (en) 2011-09-28 2015-05-26 Pelican Imaging Corporation Systems and methods for decoding light field image files using a depth map
US9129183B2 (en) 2011-09-28 2015-09-08 Pelican Imaging Corporation Systems and methods for encoding light field image files
US10430682B2 (en) 2011-09-28 2019-10-01 Fotonation Limited Systems and methods for decoding image files containing depth maps stored as metadata
US10984276B2 (en) 2011-09-28 2021-04-20 Fotonation Limited Systems and methods for encoding image files containing depth maps stored as metadata
US11729365B2 (en) 2011-09-28 2023-08-15 Adeia Imaging LLC Systems and methods for encoding image files containing depth maps stored as metadata
US9031335B2 (en) 2011-09-28 2015-05-12 Pelican Imaging Corporation Systems and methods for encoding light field image files having depth and confidence maps
US9031343B2 (en) 2011-09-28 2015-05-12 Pelican Imaging Corporation Systems and methods for encoding light field image files having a depth map
US9036931B2 (en) 2011-09-28 2015-05-19 Pelican Imaging Corporation Systems and methods for decoding structured light field image files
US10019816B2 (en) 2011-09-28 2018-07-10 Fotonation Cayman Limited Systems and methods for decoding image files containing depth maps stored as metadata
US9025894B2 (en) 2011-09-28 2015-05-05 Pelican Imaging Corporation Systems and methods for decoding light field image files having depth and confidence maps
US9811753B2 (en) 2011-09-28 2017-11-07 Fotonation Cayman Limited Systems and methods for encoding light field image files
US10275676B2 (en) 2011-09-28 2019-04-30 Fotonation Limited Systems and methods for encoding image files containing depth maps stored as metadata
US9536166B2 (en) 2011-09-28 2017-01-03 Kip Peli P1 Lp Systems and methods for decoding image files containing depth maps stored as metadata
US9223472B2 (en) 2011-12-22 2015-12-29 Microsoft Technology Licensing, Llc Closing applications
US10191633B2 (en) 2011-12-22 2019-01-29 Microsoft Technology Licensing, Llc Closing applications
US9128605B2 (en) 2012-02-16 2015-09-08 Microsoft Technology Licensing, Llc Thumbnail-image selection of applications
US10311649B2 (en) 2012-02-21 2019-06-04 Fotonation Limited Systems and method for performing depth based image editing
US20130215108A1 (en) * 2012-02-21 2013-08-22 Pelican Imaging Corporation Systems and Methods for the Manipulation of Captured Light Field Image Data
US9754422B2 (en) 2012-02-21 2017-09-05 Fotonation Cayman Limited Systems and method for performing depth based image editing
US9412206B2 (en) * 2012-02-21 2016-08-09 Pelican Imaging Corporation Systems and methods for the manipulation of captured light field image data
US9210392B2 (en) 2012-05-01 2015-12-08 Pelican Imaging Corporation Camera modules patterned with pi filter groups
US9706132B2 (en) 2012-05-01 2017-07-11 Fotonation Cayman Limited Camera modules patterned with pi filter groups
US9100635B2 (en) 2012-06-28 2015-08-04 Pelican Imaging Corporation Systems and methods for detecting defective camera arrays and optic arrays
US10334241B2 (en) 2012-06-28 2019-06-25 Fotonation Limited Systems and methods for detecting defective camera arrays and optic arrays
US9807382B2 (en) 2012-06-28 2017-10-31 Fotonation Cayman Limited Systems and methods for detecting defective camera arrays and optic arrays
US10261219B2 (en) 2012-06-30 2019-04-16 Fotonation Limited Systems and methods for manufacturing camera modules using active alignment of lens stack arrays and sensors
US11022725B2 (en) 2012-06-30 2021-06-01 Fotonation Limited Systems and methods for manufacturing camera modules using active alignment of lens stack arrays and sensors
US9766380B2 (en) 2012-06-30 2017-09-19 Fotonation Cayman Limited Systems and methods for manufacturing camera modules using active alignment of lens stack arrays and sensors
US9123117B2 (en) 2012-08-21 2015-09-01 Pelican Imaging Corporation Systems and methods for generating depth maps and corresponding confidence maps indicating depth estimation reliability
US9129377B2 (en) 2012-08-21 2015-09-08 Pelican Imaging Corporation Systems and methods for measuring depth based upon occlusion patterns in images
US9235900B2 (en) 2012-08-21 2016-01-12 Pelican Imaging Corporation Systems and methods for estimating depth and visibility from a reference viewpoint for pixels in a set of images captured from different viewpoints
US9240049B2 (en) 2012-08-21 2016-01-19 Pelican Imaging Corporation Systems and methods for measuring depth using an array of independently controllable cameras
US9858673B2 (en) 2012-08-21 2018-01-02 Fotonation Cayman Limited Systems and methods for estimating depth and visibility from a reference viewpoint for pixels in a set of images captured from different viewpoints
US9147254B2 (en) 2012-08-21 2015-09-29 Pelican Imaging Corporation Systems and methods for measuring depth in the presence of occlusions using a subset of images
US9123118B2 (en) 2012-08-21 2015-09-01 Pelican Imaging Corporation System and methods for measuring depth using an array camera employing a bayer filter
US10380752B2 (en) 2012-08-21 2019-08-13 Fotonation Limited Systems and methods for estimating depth and visibility from a reference viewpoint for pixels in a set of images captured from different viewpoints
US10462362B2 (en) 2012-08-23 2019-10-29 Fotonation Limited Feature based high resolution motion estimation from low resolution images captured using an array source
US9813616B2 (en) 2012-08-23 2017-11-07 Fotonation Cayman Limited Feature based high resolution motion estimation from low resolution images captured using an array source
US9214013B2 (en) 2012-09-14 2015-12-15 Pelican Imaging Corporation Systems and methods for correcting user identified artifacts in light field images
US10390005B2 (en) 2012-09-28 2019-08-20 Fotonation Limited Generating images from light fields utilizing virtual viewpoints
US9749568B2 (en) 2012-11-13 2017-08-29 Fotonation Cayman Limited Systems and methods for array camera focal plane control
US9143711B2 (en) 2012-11-13 2015-09-22 Pelican Imaging Corporation Systems and methods for array camera focal plane control
US9462164B2 (en) 2013-02-21 2016-10-04 Pelican Imaging Corporation Systems and methods for generating compressed light field representation data using captured light fields, array geometry, and parallax information
US10009538B2 (en) 2013-02-21 2018-06-26 Fotonation Cayman Limited Systems and methods for generating compressed light field representation data using captured light fields, array geometry, and parallax information
US9253380B2 (en) 2013-02-24 2016-02-02 Pelican Imaging Corporation Thin form factor computational array cameras and modular array cameras
US9774831B2 (en) 2013-02-24 2017-09-26 Fotonation Cayman Limited Thin form factor computational array cameras and modular array cameras
US9743051B2 (en) 2013-02-24 2017-08-22 Fotonation Cayman Limited Thin form factor computational array cameras and modular array cameras
US9374512B2 (en) 2013-02-24 2016-06-21 Pelican Imaging Corporation Thin form factor computational array cameras and modular array cameras
US9917998B2 (en) 2013-03-08 2018-03-13 Fotonation Cayman Limited Systems and methods for measuring scene information while capturing images using array cameras
US9774789B2 (en) 2013-03-08 2017-09-26 Fotonation Cayman Limited Systems and methods for high dynamic range imaging using array cameras
US11570423B2 (en) 2013-03-10 2023-01-31 Adeia Imaging Llc System and methods for calibration of an array camera
US10958892B2 (en) 2013-03-10 2021-03-23 Fotonation Limited System and methods for calibration of an array camera
US10225543B2 (en) 2013-03-10 2019-03-05 Fotonation Limited System and methods for calibration of an array camera
US9986224B2 (en) 2013-03-10 2018-05-29 Fotonation Cayman Limited System and methods for calibration of an array camera
US11272161B2 (en) 2013-03-10 2022-03-08 Fotonation Limited System and methods for calibration of an array camera
US9888194B2 (en) 2013-03-13 2018-02-06 Fotonation Cayman Limited Array camera architecture implementing quantum film image sensors
US9741118B2 (en) 2013-03-13 2017-08-22 Fotonation Cayman Limited System and methods for calibration of an array camera
US10127682B2 (en) 2013-03-13 2018-11-13 Fotonation Limited System and methods for calibration of an array camera
US9800856B2 (en) 2013-03-13 2017-10-24 Fotonation Cayman Limited Systems and methods for synthesizing images from image data captured by an array camera using restricted depth of field depth maps in which depth estimation precision varies
US9733486B2 (en) 2013-03-13 2017-08-15 Fotonation Cayman Limited Systems and methods for controlling aliasing in images captured by an array camera for use in super-resolution processing
US9787911B2 (en) 2013-03-14 2017-10-10 Fotonation Cayman Limited Systems and methods for photometric normalization in array cameras
US9100586B2 (en) 2013-03-14 2015-08-04 Pelican Imaging Corporation Systems and methods for photometric normalization in array cameras
US10547772B2 (en) 2013-03-14 2020-01-28 Fotonation Limited Systems and methods for reducing motion blur in images or video in ultra low light with array cameras
US10412314B2 (en) 2013-03-14 2019-09-10 Fotonation Limited Systems and methods for photometric normalization in array cameras
US10091405B2 (en) 2013-03-14 2018-10-02 Fotonation Cayman Limited Systems and methods for reducing motion blur in images or video in ultra low light with array cameras
US9578259B2 (en) 2013-03-14 2017-02-21 Fotonation Cayman Limited Systems and methods for reducing motion blur in images or video in ultra low light with array cameras
US10674138B2 (en) 2013-03-15 2020-06-02 Fotonation Limited Autofocus system for a conventional camera that uses depth information from an array camera
US9497370B2 (en) 2013-03-15 2016-11-15 Pelican Imaging Corporation Array camera architecture implementing quantum dot color filters
US10182216B2 (en) 2013-03-15 2019-01-15 Fotonation Limited Extended color processing on pelican array cameras
US9800859B2 (en) 2013-03-15 2017-10-24 Fotonation Cayman Limited Systems and methods for estimating depth using stereo array cameras
US10542208B2 (en) 2013-03-15 2020-01-21 Fotonation Limited Systems and methods for synthesizing high resolution images using image deconvolution based on motion and depth information
US9438888B2 (en) 2013-03-15 2016-09-06 Pelican Imaging Corporation Systems and methods for stereo imaging with camera arrays
US9955070B2 (en) 2013-03-15 2018-04-24 Fotonation Cayman Limited Systems and methods for synthesizing high resolution images using image deconvolution based on motion and depth information
US10122993B2 (en) 2013-03-15 2018-11-06 Fotonation Limited Autofocus system for a conventional camera that uses depth information from an array camera
US9633442B2 (en) 2013-03-15 2017-04-25 Fotonation Cayman Limited Array cameras including an array camera module augmented with a separate camera
US9497429B2 (en) 2013-03-15 2016-11-15 Pelican Imaging Corporation Extended color processing on pelican array cameras
US10455218B2 (en) 2013-03-15 2019-10-22 Fotonation Limited Systems and methods for estimating depth using stereo array cameras
US9602805B2 (en) 2013-03-15 2017-03-21 Fotonation Cayman Limited Systems and methods for estimating depth using ad hoc stereo array cameras
US10638099B2 (en) 2013-03-15 2020-04-28 Fotonation Limited Extended color processing on pelican array cameras
US20140268324A1 (en) * 2013-03-18 2014-09-18 3-D Virtual Lens Technologies, Llc Method of displaying 3d images from 2d source images using a barrier grid
US9807081B2 (en) 2013-05-29 2017-10-31 Microsoft Technology Licensing, Llc Live tiles without application-code execution
US9450952B2 (en) 2013-05-29 2016-09-20 Microsoft Technology Licensing, Llc Live tiles without application-code execution
US10110590B2 (en) 2013-05-29 2018-10-23 Microsoft Technology Licensing, Llc Live tiles without application-code execution
US20160131916A1 (en) * 2013-07-01 2016-05-12 Roland Wolf Device and method for generating a spatial depth effect of image information present in a 2d image area
US10540806B2 (en) 2013-09-27 2020-01-21 Fotonation Limited Systems and methods for depth-assisted perspective distortion correction
US9898856B2 (en) 2013-09-27 2018-02-20 Fotonation Cayman Limited Systems and methods for depth-assisted perspective distortion correction
US9426343B2 (en) 2013-11-07 2016-08-23 Pelican Imaging Corporation Array cameras incorporating independently aligned lens stacks
US9185276B2 (en) 2013-11-07 2015-11-10 Pelican Imaging Corporation Methods of manufacturing array camera modules incorporating independently aligned lens stacks
US9264592B2 (en) 2013-11-07 2016-02-16 Pelican Imaging Corporation Array camera modules incorporating independently aligned lens stacks
US9924092B2 (en) 2013-11-07 2018-03-20 Fotonation Cayman Limited Array cameras incorporating independently aligned lens stacks
US10767981B2 (en) 2013-11-18 2020-09-08 Fotonation Limited Systems and methods for estimating depth from projected texture using camera arrays
US11486698B2 (en) 2013-11-18 2022-11-01 Fotonation Limited Systems and methods for estimating depth from projected texture using camera arrays
US10119808B2 (en) 2013-11-18 2018-11-06 Fotonation Limited Systems and methods for estimating depth from projected texture using camera arrays
US9426361B2 (en) 2013-11-26 2016-08-23 Pelican Imaging Corporation Array camera configurations incorporating multiple constituent array cameras
US9456134B2 (en) 2013-11-26 2016-09-27 Pelican Imaging Corporation Array camera configurations incorporating constituent array cameras and constituent cameras
US10708492B2 (en) 2013-11-26 2020-07-07 Fotonation Limited Array camera configurations incorporating constituent array cameras and constituent cameras
US9813617B2 (en) 2013-11-26 2017-11-07 Fotonation Cayman Limited Array camera configurations incorporating constituent array cameras and constituent cameras
US10089740B2 (en) 2014-03-07 2018-10-02 Fotonation Limited System and methods for depth regularization and semiautomatic interactive matting using RGB-D images
US10574905B2 (en) 2014-03-07 2020-02-25 Fotonation Limited System and methods for depth regularization and semiautomatic interactive matting using RGB-D images
US9841874B2 (en) 2014-04-04 2017-12-12 Microsoft Technology Licensing, Llc Expandable application representation
US10459607B2 (en) 2014-04-04 2019-10-29 Microsoft Technology Licensing, Llc Expandable application representation
US9247117B2 (en) 2014-04-07 2016-01-26 Pelican Imaging Corporation Systems and methods for correcting for warpage of a sensor array in an array camera module by introducing warpage into a focal plane of a lens stack array
US9769293B2 (en) 2014-04-10 2017-09-19 Microsoft Technology Licensing, Llc Slider cover for computing device
US9451822B2 (en) 2014-04-10 2016-09-27 Microsoft Technology Licensing, Llc Collapsible shell cover for computing device
US9521319B2 (en) 2014-06-18 2016-12-13 Pelican Imaging Corporation Array cameras and array camera modules including spectral filters disposed outside of a constituent image sensor
US10254942B2 (en) 2014-07-31 2019-04-09 Microsoft Technology Licensing, Llc Adaptive sizing and positioning of application windows
US10678412B2 (en) 2014-07-31 2020-06-09 Microsoft Technology Licensing, Llc Dynamic joint dividers for application windows
US10592080B2 (en) 2014-07-31 2020-03-17 Microsoft Technology Licensing, Llc Assisted presentation of application windows
US10642365B2 (en) 2014-09-09 2020-05-05 Microsoft Technology Licensing, Llc Parametric inertia and APIs
US11546576B2 (en) 2014-09-29 2023-01-03 Adeia Imaging Llc Systems and methods for dynamic calibration of array cameras
US10250871B2 (en) 2014-09-29 2019-04-02 Fotonation Limited Systems and methods for dynamic calibration of array cameras
US9674335B2 (en) 2014-10-30 2017-06-06 Microsoft Technology Licensing, Llc Multi-configuration input device
US9942474B2 (en) 2015-04-17 2018-04-10 Fotonation Cayman Limited Systems and methods for performing high speed video capture and depth estimation using array cameras
US10818026B2 (en) 2017-08-21 2020-10-27 Fotonation Limited Systems and methods for hybrid depth regularization
US10482618B2 (en) 2017-08-21 2019-11-19 Fotonation Limited Systems and methods for hybrid depth regularization
US11562498B2 (en) 2017-08-21 2023-01-24 Adeia Imaging LLC Systems and methods for hybrid depth regularization
CN109257540A (en) * 2018-11-05 2019-01-22 浙江舜宇光学有限公司 Take the photograph photography bearing calibration and the camera of lens group more
US11270110B2 (en) 2019-09-17 2022-03-08 Boston Polarimetrics, Inc. Systems and methods for surface modeling using polarization cues
US11699273B2 (en) 2019-09-17 2023-07-11 Intrinsic Innovation Llc Systems and methods for surface modeling using polarization cues
US11525906B2 (en) 2019-10-07 2022-12-13 Intrinsic Innovation Llc Systems and methods for augmentation of sensor systems and imaging systems with polarization
US11302012B2 (en) 2019-11-30 2022-04-12 Boston Polarimetrics, Inc. Systems and methods for transparent object segmentation using polarization cues
US11842495B2 (en) 2019-11-30 2023-12-12 Intrinsic Innovation Llc Systems and methods for transparent object segmentation using polarization cues
US11580667B2 (en) 2020-01-29 2023-02-14 Intrinsic Innovation Llc Systems and methods for characterizing object pose detection and measurement systems
US11797863B2 (en) 2020-01-30 2023-10-24 Intrinsic Innovation Llc Systems and methods for synthesizing data for training statistical models on different imaging modalities including polarized images
US11683594B2 (en) 2021-04-15 2023-06-20 Intrinsic Innovation Llc Systems and methods for camera exposure control
US11290658B1 (en) 2021-04-15 2022-03-29 Boston Polarimetrics, Inc. Systems and methods for camera exposure control
US11954886B2 (en) 2021-04-15 2024-04-09 Intrinsic Innovation Llc Systems and methods for six-degree of freedom pose estimation of deformable objects
US11953700B2 (en) 2021-05-27 2024-04-09 Intrinsic Innovation Llc Multi-aperture polarization optical systems using beam splitters
US11689813B2 (en) 2021-07-01 2023-06-27 Intrinsic Innovation Llc Systems and methods for high dynamic range imaging using crossed polarizers

Also Published As

Publication number Publication date
CN101133360A (en) 2008-02-27
US7073908B1 (en) 2006-07-11
ZA200705671B (en) 2008-10-29
CL2007002009A1 (en) 2008-06-06

Similar Documents

Publication Publication Date Title
US7073908B1 (en) Enhancement of depth perception
US7086735B1 (en) Enhancement of visual perception
US7612795B2 (en) Enhancement of visual perception III
US7440004B2 (en) 3-D imaging arrangements
US8531507B2 (en) Enhancement of visual perception IV
US7545405B2 (en) Enhancement of visual perception II
US7907150B2 (en) Method of fusion or merging imagery data for improved visual perception using monoscopic and stereographic fusion and retinal decay techniques
US8115803B2 (en) Apparatus and method for projecting spatial image
US10616567B1 (en) Frustum change in projection stereo rendering
KR20120095508A (en) Stereoscopic spectacle lens for combined use as sunglasses
EP1836531A2 (en) Enhancement of visual perception
JP2008527918A5 (en)
Faubert Motion parallax, stereoscopy, and the perception of depth: Practical and theoretical issues
JP4284158B2 (en) Stereoscopic two-dimensional image display system and image display method
Serrano-Pedraza et al. A specialization for vertical disparity discontinuities
US20130044104A1 (en) Methods and Apparatus for Displaying an Image with Enhanced Depth Effect
US8131062B2 (en) Method of processing an image
Wang et al. Clarifying how defocus blur and disparity affect the perceived depth
Clapp Stereoscopic Perception
AU2004226624B2 (en) Image processing
Banks I3.1: Invited paper: The importance of focus cues in stereo 3D displays
JP2017032802A (en) Exhibition set and exhibition method
MX2009008484A (en) 3D peripheral and stereoscopic vision goggles.
Charman et al. Accommodation to perceived depth in stereotests
CN105824127A (en) 3D display device

Legal Events

Date Code Title Description
STCF Information on status: patent grant
Free format text: PATENTED CASE
FPAY Fee payment
Year of fee payment: 4
SULP Surcharge for late payment
REMI Maintenance fee reminder mailed
FPAY Fee payment
Year of fee payment: 8
SULP Surcharge for late payment
Year of fee payment: 7
FEPP Fee payment procedure
Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.)
FEPP Fee payment procedure
Free format text: 11.5 YR SURCHARGE- LATE PMT W/IN 6 MO, SMALL ENTITY (ORIGINAL EVENT CODE: M2556)
MAFP Maintenance fee payment
Free format text: PAYMENT OF MAINTENANCE FEE, 12TH YR, SMALL ENTITY (ORIGINAL EVENT CODE: M2553)
Year of fee payment: 12