US20140361988A1 - Touch Free Interface for Augmented Reality Systems - Google Patents

Touch Free Interface for Augmented Reality Systems

Info

Publication number
US20140361988A1
Authority
US
United States
Prior art keywords
processor
visual data
user
predefined
augmented reality
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/345,592
Inventor
Itay Katz
Amnon Shenfeld
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Eyesight Mobile Technologies Ltd
Original Assignee
Eyesight Mobile Technologies Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Eyesight Mobile Technologies Ltd filed Critical Eyesight Mobile Technologies Ltd
Priority to US14/345,592
Assigned to EYESIGHT MOBILE TECHNOLOGIES LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: SHENFELD, AMNON; KATZ, ITAY
Publication of US20140361988A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B27/0172Head mounted characterised by optical features
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/002Specific input/output arrangements not covered by G06F3/01 - G06F3/16
    • G06F3/005Input arrangements through a video camera
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012Head tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016Input arrangements with force or tactile feedback as computer generated output to the user
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842Selection of displayed objects or displayed text elements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16Sound input; Sound output
    • G06F3/165Management of the audio stream, e.g. setting of volume, audio stream path
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20Movements or behaviour, e.g. gesture recognition
    • G06V40/28Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/667Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0138Head-up displays characterised by optical features comprising image capture systems, e.g. camera
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/014Head-up displays characterised by optical features comprising information/image processing systems
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B2027/0178Eyeglass type
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0179Display position adjusting means not related to the information to be displayed
    • G02B2027/0187Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00Indexing scheme for image data processing or generation, in general
    • G06T2200/24Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]

Definitions

  • the present invention relates to methods and systems for augmented reality.
  • Augmented reality is a term for a live, direct or indirect view of a physical, real-world environment whose elements are augmented by computer-generated information such as text, sound, video, graphics or GPS data. Artificial information about the environment and its objects is thus overlaid on a real world view or image. Augmentation is typically performed in real time and in semantic context with environmental elements, so that information about the user's surrounding real world becomes interactive and digitally manipulable.
  • the main hardware components for augmented reality are a processor, display, sensors and input devices. These elements, specifically a CPU, display, camera and MEMS sensors such as an accelerometer, GPS, or solid state compass, are present in portable devices such as smartphones, which allows them to function as augmented reality platforms.
  • Augmented reality systems have found applications in entertainment, navigation, assembly processes, maintenance, and medical procedures.
  • Portable augmented reality systems have also found applications in tourism and sightseeing, where augmented reality is used to present information about the real world objects and places being viewed.
  • An immersive augmented reality experience is provided using a head-mounted display, typically in the form of goggles or a helmet.
  • In a head-mounted display, virtual visual objects are superimposed on the user's view of a real world scene.
  • the head mounted display is tracked with sensors that allow the system to align virtual information with the physical world.
  • the tracking may be performed, for example, using any one or more of such technologies as digital cameras or other optical sensors, accelerometers, GPS, gyroscopes, solid state compasses, RFID and wireless sensors.
  • Head-mounted displays are either optical see-through or video see-through.
  • Optical see-through employs solutions such as half-silvered mirrors that pass images through the lens and overlay information to be reflected into the user's eyes, and transparent LCD projectors that display the digital information and images directly or indirectly to the user's retina.
  • the present invention provides an interactive system for augmented reality.
  • the interactive system of the invention includes a wearable data display device that may be incorporated for example, into a pair of glasses or goggles.
  • the wearable display has a device providing location extraction capabilities (such as GPS) and a compass.
  • the system also includes a user interface that allows a user to select computer generated data to augment a real world scene that the user is viewing.
  • a camera obtains images of the real-world scene being viewed.
  • a processor detects a predefined object in images of the real world scene captured by the camera such as a user's finger. When the user points to an element in the scene, data relating to the element are displayed on the data display device and are superimposed on the user's view of the scene.
  • the invention provides a method for augmented reality comprising:
  • the image sensor may be selected from a camera, a light sensor, an IR sensor, an ultrasonic sensor, a proximity sensor, a CMOS image sensor, a shortwave infrared (SWIR) image sensor, or a reflectivity sensor.
  • One or more of the state sensors may be selected from an optical sensor, an accelerometer, GPS, a gyroscope, a compass, a magnetic sensor, a sensor indicating the direction of the device relative to the Earth's magnetic field, a gravity sensor and an RFID detector.
  • the data associated with the identified object may be obtained by searching in a memory for data associated with the real world object.
  • the predefined object may be, for example, a hand, a part of a hand, two hands, parts of two hands, a finger, part of a finger, or a finger tip.
  • the viewing device may be configured to be worn by a user, for example, glasses or goggles.
  • the viewing device may be incorporated in a mobile communication device.
  • the step of identifying, in the images of the real world scene obtained by the image sensor or sensors, the real world object may comprise determining a location (X,Y) of the predefined object in an image obtained by the image sensors and determining one or both of a location and an orientation of the display device as provided by the state sensors.
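  • As a rough illustration of this step (a minimal sketch, not the patent's algorithm; the function and parameter names are assumptions), the fingertip's pixel location can be combined with the device heading reported by the state sensors to obtain an absolute pointing bearing:

```python
def pixel_to_bearing(x, image_width, horizontal_fov_deg, device_heading_deg):
    """Map a fingertip pixel column x to an absolute compass bearing.

    A pinhole-camera approximation: the horizontal offset of the pixel
    from the image centre is converted to an angular offset inside the
    camera's field of view and added to the compass heading of the
    viewing device obtained from its state sensors.
    """
    deg_per_pixel = horizontal_fov_deg / image_width          # angular size of one pixel
    offset_deg = (x - image_width / 2.0) * deg_per_pixel      # offset from the optical axis
    return (device_heading_deg + offset_deg) % 360.0          # wrap to [0, 360)

# Example: fingertip detected at column 400 of a 640-pixel-wide frame,
# 60 degree horizontal field of view, device facing due north.
print(pixel_to_bearing(400, 640, 60.0, 0.0))   # -> 7.5 degrees east of north
```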
  • the method of the invention may further comprise communicating with an external device or website.
  • the communication may comprise sending a message to an application running on the external device, a service running on the external device, an operating system running on the external device, a process running on the external device, one or more applications running on a processor of the external device, a software program running in the background of the external device, or to one or more services running on the external device.
  • the method may further comprise sending a message to an application running on the mobile communication device, a service running on the mobile communication device, an operating system running on the mobile communication device, a process running on the mobile communication device, one or more applications running on a processor of the mobile communication device, a software program running in the background of the mobile communication device, or to one or more services running on the mobile communication device.
  • the method may further comprise sending a message requesting data relating to a real world object identified in an image from an application running on the external device, a service running on the external device, an operating system running on the external device, a process running on the external device, one or more applications running on a processor of the external device, a software program running in the background of the external device, or from one or more services running on the external device.
  • the method may further comprise sending a message requesting data relating to a real world object identified in an image from an application running on the mobile communication device, a service running on the mobile communication device, an operating system running on the mobile communication device, a process running on the mobile communication device, one or more applications running on a processor of the mobile communication device, a software program running in the background of the mobile communication device, or from one or more services running on the mobile communication device.
  • the message to the external device or website may be a command.
  • the command may be selected from a command to run an application on the external device or website, a command to stop an application running on the external device or website, a command to activate a service running on the external device or website, a command to stop a service running on the external device or website, or a command to send data relating to a real world object identified in an image.
  • the message to the mobile communication device may be a command.
  • the command may be selected from a command to run an application on the mobile communication device, a command to stop an application running on the mobile communication device, a command to activate a service running on the mobile communication device, a command to stop a service running on the mobile communication device, or a command to send data relating to a real world object identified in an image.
  • the method may further comprise receiving from the external device or website data relating to a real world object identified in an image and presenting the received data to a user.
  • the communication with the external device or website may be over a communication network.
  • the command to the external device may be selected from depressing a virtual key displayed on a display device of the external device; rotating a selection carousel; switching between desktops, running on the external device a predefined software application; turning off an application on the external device; turning speakers on or off; turning volume up or down; locking the external device, unlocking the external device, skipping to another track in a media player or between IPTV channels; controlling a navigation application; initiating a call, ending a call, presenting a notification, displaying a notification; navigating in a photo or music album gallery, scrolling web-pages, presenting an email, presenting one or more documents or maps, controlling actions in a game, pointing at a map, zooming-in or out on a map or images, painting on an image, grasping an activatable icon and pulling the activatable icon out from the display device, rotating an activatable icon, emulating touch commands on the external device, performing one or more multi-touch commands, a touch gesture command, typing, clicking on a
  • the predefined gesture may be selected from a swiping motion, a pinching motion of two fingers, pointing, a left to right gesture, a right to left gesture, an upwards gesture, a downwards gesture, a pushing gesture, opening a clenched fist, opening a clenched fist and moving towards the image sensor, a tapping gesture, a waving gesture, a clapping gesture, a reverse clapping gesture, closing a hand into a fist, a pinching gesture, a reverse pinching gesture, a gesture of splaying fingers on a hand, a reverse gesture of splaying fingers on a hand, pointing at an activatable icon, holding an activating object for a predefined amount of time, clicking on an activatable icon, double clicking on an activatable icon, clicking from the right side on an activatable icon, clicking from the left side on an activatable icon, clicking from the bottom on an activatable icon, clicking from the top on an activatable
  • the data associated with the identified object may be any one or more of visual data, audio data, or textual data.
  • the data associated with the identified object may be an activatable icon.
  • the activatable icon may be a 2D or 3D activatable icon.
  • the activatable icon may be perceived by a user in a 3D space in front of the user.
  • the method of the invention may have two or more operational modes.
  • the method may change the operational mode of the system upon identification of a predefined gesture.
  • An operational mode may be specified by any one or more of: the gestures to be identified; the algorithms that are active in the gesture detection module; the resolution of images captured by the image sensor; the capture rate of images captured by the image sensor; the level of detail of the data to be presented; the activatable icons to be presented to the user; the source of the data to be presented; the activatable icons to be displayed on the display device; and an active on-line service.
  • the operational mode may be a mode selected from: a mode of video recording of images by the image sensor upon identification of a predefined gesture; a mode of recording sounds by a microphone upon identification of a predefined gesture and stopping the recording upon identification of another predefined gesture; a mode of continuously monitoring video or sound and, following detection of a predefined gesture, recording the video or sound starting from a predefined amount of time prior to identification of the gesture and stopping the recording after identification of another predefined gesture; a mode of adding tags in a captured and real-time recorded video upon identification of a predefined gesture; a mode of selecting an area in the field of view as captured by the camera, copying the area to another location in the field of view and resizing it; a mode of employing a tracker on a selected area in an image and presenting the selected area in real time in the resized and relocated area on the display device; and a mode of capturing an image upon identification of a predefined gesture.
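  • A minimal sketch of how such gesture-driven mode switching might be organised (the gesture and mode names below are illustrative assumptions, not taken from this description):

```python
# Hypothetical mapping from recognized gestures to operational modes.
GESTURE_TO_MODE = {
    "swipe_right_to_left": "video_recording",
    "clap":                "sound_recording",
    "open_hand":           "image_capture",
    "draw_contour":        "area_selection",
}

class ModeController:
    """Holds the current operational mode and switches it when the gesture
    detection module reports a mode-changing gesture."""

    def __init__(self, initial_mode="idle"):
        self.mode = initial_mode

    def on_gesture(self, gesture_name):
        new_mode = GESTURE_TO_MODE.get(gesture_name)
        if new_mode is not None:
            self.mode = new_mode   # only predefined gestures change the mode
        return self.mode

controller = ModeController()
print(controller.on_gesture("swipe_right_to_left"))   # -> video_recording
print(controller.on_gesture("unknown_gesture"))       # mode unchanged
```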
  • the method of the invention may further comprise running a tracking algorithm that tracks the identified real world object and maintains the displayed associated visual data in a fixed position relative to the identified real world object.
  • An object recognition module may be employed to detect the predefined object only when the display device has a level of motion below a predetermined threshold.
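  • One way to realise this gating is sketched below; the threshold value and the detector call are assumptions used only to make the example self-contained:

```python
import math

GRAVITY = 9.81            # m/s^2
MOTION_THRESHOLD = 0.5    # assumed residual-acceleration threshold, m/s^2

def motion_magnitude(accel_xyz):
    """Rough device-motion estimate: deviation of the accelerometer
    vector norm from gravity."""
    ax, ay, az = accel_xyz
    return abs(math.sqrt(ax * ax + ay * ay + az * az) - GRAVITY)

def maybe_detect(frame, accel_xyz, detector):
    """Run the (expensive) object recognition step only while the display
    device is roughly stationary; otherwise skip the frame."""
    if motion_magnitude(accel_xyz) < MOTION_THRESHOLD:
        return detector(frame)
    return None

# Stub detector so the sketch runs without a real vision pipeline.
result = maybe_detect(frame=None, accel_xyz=(0.1, 0.2, 9.8),
                      detector=lambda f: "fingertip at (400, 240)")
print(result)
```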
  • the method may further comprise providing feedback when a predefined gesture has been identified.
  • the feedback may be, for example, visual feedback, audio feedback, haptic feedback, directional vibration, air tactile feedback, or ultrasonic feedback.
  • the feedback may be a visual indication in a form selected from an activatable icon displayed on the display device, a change in an activatable icon displayed on the display device, a change in color of an activatable icon displayed on the display device, a change in size of an activatable icon displayed on the display device, animation of an activatable icon displayed on the display device, an indication light, an indicator moving on a display device, an indicator moving on the display device that appears on top of all other images or video appearing on the display device and the appearance of a glow around the predefined object.
  • the feedback may be a vibration, a directional vibration indication, or an air tactile indication.
  • part of an activatable icon displayed on the display device may not be presented where the predefined object is located, so that the predefined object appears to be on top of the activatable icon.
  • Activatable icons may be removed from the display device when the display device has a level of activity above a predefined threshold.
  • The removed icons may be returned to the display device, for example, when the display device has a level of motion below the predefined threshold.
  • the method may be brought into an active mode when a predefined action is performed.
  • the predefined action may be selected from: bringing the predefined object into the field of view from below; placing the predefined object in a certain location or pose, such as pointing at the bottom right corner of the camera field of view or opening a hand in the camera field of view; performing a predefined gesture such as moving the hand from right to left across the field of view; when an activatable icon is displayed, performing a predefined gesture correlated with the activatable icon, such as pointing at the activatable icon, performing a waving gesture at the location where the activatable icon is presented, or sliding the floating activatable icon from one location to another by performing a gesture in the 3D space where the activatable icon is perceived to be located; touching the device; or tapping on the device if the device is provided with an accelerometer.
  • the system may enter the active mode when the user passes a hand near the device if the device is provided with a proximity sensor, or ultrasonic sensor.
  • the system may also be activated by a voice command, or when the user places the predefined object in a particular location in the field of view.
  • the system may enter the active mode only when there is relevant data associated with the real world in the field of view of the user. The system may indicate to the user when there is relevant data to be presented, or when it is ready for interaction.
  • the method of the invention may further comprise attaching a visual indication to a real-world object indicating the existence in a memory of data correlated with the real-world object.
  • the visual indication may be overlaid on an image of the real-world object.
  • the visual indication may be selected from an activatable icon, a photo, and an image of an envelope.
  • the method of the invention may further comprise a calibration process to record one or more physical parameters of the predefined object.
  • the calibration process may comprise any one or more steps selected from presenting on the display activatable icons in different locations in a 3D space, extracting physical features of the predefined object, and determining a correlation between dimensions of the predefined object and its distance from the camera.
  • the calibration process may comprise a step of constructing a triangle having vertices at one of the image sensors and at a tip of the predefined object and having a side formed by a user's line of sight. The distance of the real world object from the camera may be estimated based on information extracted in the calibration.
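  • The size-versus-distance correlation recorded during calibration can be expressed with the pinhole-camera similar-triangles relation; the sketch below, with hypothetical numbers, is one simple way to use it:

```python
def calibrate(reference_width_px, reference_distance_cm):
    """Record the apparent width of the predefined object (e.g. a fingertip)
    at a known distance; by similar triangles, width_px * distance is constant."""
    return reference_width_px * reference_distance_cm

def estimate_distance(current_width_px, calibration_constant):
    """Estimate the object's current distance from the camera from its
    current apparent width in pixels."""
    return calibration_constant / current_width_px

# During calibration the fingertip measured 40 px wide at 30 cm from the camera.
k = calibrate(reference_width_px=40, reference_distance_cm=30)
# Later the fingertip appears 25 px wide, so it is farther away.
print(f"estimated distance: {estimate_distance(25, k):.1f} cm")   # -> 48.0 cm
```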
  • the method may further comprise displaying a keyboard enabling text typing.
  • the keyboard may be displayed upon detection of a predefined gesture, such as a gesture from right to left, presenting an open hand, presenting two open hands in a predefined region of the field of view of an image sensor.
  • the keyboard may be displayed upon performing a click gesture in a 3D typing area or where a predefined activatable icon is perceived to be located.
  • the invention also provides a system comprising a device configured to execute the method of the invention.
  • the invention also provides a computer program comprising computer program code means for performing all the steps of the method of the invention when said program is run on a computer.
  • the computer program may be embodied on a computer readable medium.
  • a user may interact with a visual image typically displayed through glasses.
  • the user's view of reality is, thus, augmented by the information presented on the display.
  • One issue with augmented reality devices is the manner in which the user interacts with and controls the device.
  • Traditional control devices e.g., a mouse, track ball, or touch screen, are difficult to use with augmented reality devices.
  • Using gesture recognition in an augmented reality system is not trivial, because the user, and thus the augmented reality device, is constantly moving in real time.
  • the invention thus provides a computer program product containing instructions for causing a processor to perform a method comprising:
  • the augmented information may include at least one of information associated with objects in the environment; images associated with the environment; and distances associated with the environment.
  • the correlating may include determining a reference location in three dimensional space of at least a portion of the user's hand, and determining in at least one of the augmented information and the image information data associated with the reference location.
  • the altering may include changing the augmented information as a function of the data associated with the reference location.
  • FIG. 1 shows schematically a system for augmented reality in accordance with one embodiment of the invention
  • FIG. 2 shows a system for augmented reality comprising a set of goggles in accordance with one embodiment of the invention
  • FIG. 3 shows the system of FIG. 2 in use
  • FIG. 4 a shows a view of a real-world scene displayed on a display device of the system of FIG. 2
  • FIG. 4 b shows the view of FIG. 4 a with the user's finger pointing to an object in the view
  • FIG. 4 c shows visual text relating to the object at which the user's finger is pointing overlaid on the view of FIG. 4 b;
  • FIG. 5 shows a system for augmented reality integral with a communication device in accordance with another embodiment of the invention.
  • FIG. 6 a shows designating an area in the field of view of an image sensor by the user performing a gesture of “drawing” the contour of the area
  • FIG. 6 b shows resizing the selected area by performing a second gesture
  • FIG. 6 c shows the area after resizing
  • FIG. 6 d shows the area after being dragged to a new location in the field of view.
  • FIG. 1 shows schematically a system 30 for augmented reality in accordance with one embodiment of the invention.
  • the system 30 includes one or more image sensors 32 configured to obtain images of a real world scene. Any type of image sensor may be used in the system of the invention, such as a camera, a light sensor, an IR sensor, an ultrasonic sensor, a proximity sensor, a CMOS image sensor, a shortwave infrared (SWIR) image sensor or a reflectivity sensor.
  • the system 30 further includes a viewing device 34 having one or more display devices 35 that enable a user to see both the real world scene and external information, such as images, videos, or audio signals, superimposed upon the real world scene.
  • Any type of display device that allows a user to both see the real world scene and the displayed data may be used in the system of the invention.
  • the display devices 35 may comprise, for example, a surface upon which visual material is presented to a user or one or more projectors that display images directly to the user's retina.
  • a processor 36 obtains orientation and/or location data of the system 30 from one or more state sensors 38, which may be, for example, any one or more of an optical sensor, an accelerometer, GPS, a gyroscope, a solid state compass, a magnetic sensor, a gravity sensor, and an RFID detector.
  • the processor 36 may be, for example, a dedicated processor, a general purpose processor, a DSP (digital signal processor), a GPU (graphics processing unit), dedicated hardware, or a processor that can run on an external device.
  • the system 30 may run as software on the viewing device 34 or on another device 37, such as a smartphone, that incorporates the other components of the system 30.
  • the processor 36 is configured to run a gesture detection module 40 that identifies in images of the real world scene obtained by the image sensor 32 one or more real world objects at which a predefined object is pointing.
  • the real world objects may be, for example, a building or a billboard. Determination of the real world objects utilizes data provided by the state sensors 38 .
  • the predefined object may be a user's finger or other object such as a stylus or wand.
  • the processor 36 When the processor 36 has identified a real world object at which the predefined object is pointing, the processor searches in a memory 42 for data associated with the identified object.
  • the data may be, for example, visual data, audio data, or textual data.
  • the visual data may be textual information relating to the identified object.
  • the processor displays the associated visual data associated with the identified object on the display of the viewing device.
  • the memory 42 may be integral with the system 30 or may be remotely located and accessed over a communication network, such as the Internet.
  • the system 30 may thus comprise a communication module 39 allowing the system 30 to communicate with a network, wireless network, cellular network, an external device such as another device 30 , a mobile phone, tablet, or an Internet website and so on.
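  • A minimal sketch of such a lookup, checking an integral memory first and falling back to a remote source reached through the communication module (the URL scheme, object identifiers and response format are assumptions):

```python
import json
import urllib.request

# Stand-in for the integral memory 42; keys and values are illustrative only.
LOCAL_OBJECT_DATA = {
    "billboard_14": {"type": "text", "content": "Sale ends Sunday"},
}

def lookup_object_data(object_id, remote_base_url=None):
    """Return data associated with an identified real world object,
    searching the local memory first and optionally querying a remote
    service over the communication network."""
    data = LOCAL_OBJECT_DATA.get(object_id)
    if data is not None or remote_base_url is None:
        return data
    try:
        with urllib.request.urlopen(f"{remote_base_url}/{object_id}") as resp:
            return json.loads(resp.read().decode("utf-8"))
    except OSError:
        return None   # network unavailable; nothing to display

print(lookup_object_data("billboard_14"))
```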
  • the data may be an activatable icon.
  • The term “activatable icon” refers to a region in an image or video associated with one or more messages or commands that are activated by a user interaction.
  • the activatable icons may be, for example, a 2D or 3D visual element such as virtual buttons, a virtual keyboard or icon.
  • Activatable icons are activated by means of one or more predefined objects that are recognizable by the system, and may be, for example, a stylus, one or more of a user's hands or a portion of a hand, one or more fingers or a portion of a finger such as a finger tip.
  • Activation of one or more of the activatable icons by a predefined object results in the generation of a message or a command addressed to an operating system, one or more services, one or more applications, one or more devices, one or more remote applications, one or more remote services, or one or more remote devices.
  • the processor 36 may be configured to send a message or command to the device 37 or to a remote device, to an application running on the device, to a service running on the device 37, to an operating system running on the device, to a process running on the device, to a software program running in the background, or to one or more services or processes running on the device.
  • the message or command may be sent over a communication network such as the Internet or a cellular phone network.
  • the command may be, for example, a command to run an application on the device, a command to stop an application running on the device, a command to activate a service running on the device, a command to stop a service running on the device, or a command to send data to the processor 36 relating to a real world object identified in an image by the processor 36 .
  • the command may be a command to the device 37 such as depressing a virtual key displayed on a display device of the device; rotating a selection carousel; switching between desktops, running on the device a predefined software application; turning off an application on the device; turning speakers on or off; turning volume up or down; locking the device, unlocking the device, skipping to another track in a media player or between IPTV channels; controlling a navigation application; initiating a call, ending a call, presenting a notification, displaying a notification; navigating in a photo or music album gallery, scrolling web-pages, presenting an email, presenting one or more documents or maps, controlling actions in a game, controlling interactive video or animated content, editing video or images, pointing at a map, zooming-in or out on a map or images, painting on an image, pushing an activatable icon away from the display device, grasping an activatable icon and pulling the activatable icon out from the display device, rotating an activatable icon, emulating touch commands on the device,
  • the communication module may be used to transmit a message that may be addressed, for example, to a remote device.
  • the message may be, for example a command to a remote device.
  • the command may be, for example a command to run an application on the remote device, a command to stop an application running on the remote device, a command to activate a service running on the remote device, a command to stop a service running on the remote device.
  • the message may be a command to the remote device selected from depressing a virtual key displayed on a display device of the remote device; rotating a selection carousel; switching between desktops, running on the remote device a predefined software application; turning off an application on the remote device; turning speakers on or off; turning volume up or down; locking the remote device, unlocking the remote device, skipping to another track in a media player or between IPTV channels; controlling a navigation application; initiating a call, ending a call, presenting a notification, displaying a notification; navigating in a photo or music album gallery, scrolling web-pages, presenting an email, presenting one or more documents or maps, controlling actions in a game, pointing at a map, zooming-in or out on a map or images, painting on an image, grasping an activatable icon and pulling the activatable icon out from the display device, rotating an activatable icon, emulating touch commands on the remote device, performing one or more multi-touch commands, a touch gesture command, typing, clicking
  • the message can be a request for data associated with the identified object.
  • the data request message may be addressed to an application, a service, a process, or a thread running on the device, or to an application, a service, a process, or a thread running on an external device, or to an online service.
  • an object recognition module to detect the predefined object can be employed only when the headset is not moving significantly as determined from information obtained by the state sensors.
  • FIG. 2 shows a system 2 for augmented reality in accordance with one embodiment of the invention.
  • the system 2 comprises a portable viewing device that may be for example, an interactive head-mounted eyepiece such as a pair of eyeglasses or goggles 4 .
  • the goggles 4 are provided with an image sensor 6 that obtains images of a real-world scene 8 .
  • the scene 8 may include, for example, one or more buildings 12 , or one or more billboards 14 .
  • the goggles may be provided with one or more display devices 10 that are located in the goggles 4 so as to be positioned in front of a user's eyes when the goggles 4 are worn by the user.
  • the display devices 10 may be, for example, see-through devices such as transparent LCD screens through which the real world scene is viewed, together with presenting external data.
  • the system 2 further comprises a processor 16 that is configured to identify, in images captured by the image sensors 6 , a predefined object performing a gesture or pointing at a real world object in the real world scene 8 or at activatable icons displayed to the user.
  • the system 2 also includes one or more location and/or orientation sensors 23 such as GPS, an accelerometer, a gyroscope, a solid state compass, a magnetic sensor, or a gravity sensor.
  • FIG. 5 shows a system 40 for augmented reality in accordance with another embodiment of the invention.
  • the system 40 is integrated into a mobile communication device 42 such as a mobile phone, tablet, or camera.
  • a front view of the communication device 42 is shown in FIG. 5 a
  • a rear view of the communication device 42 is shown in FIG. 5 b .
  • the communication device 42 is provided with an image sensor 46 on its rear surface, opposite to the display device, that obtains images of a real-world scene.
  • the communication device 42 is also provided with a display device 48 on its front surface that is positioned in front of a user when the camera 46 is directed towards a real world scene.
  • the display device 48 may be, for example, an LCD screen that presents to the user images of a real world scene obtained by the camera 46 , together with visual data, as explained below.
  • the system 40 utilizes the camera 46 , the display device 48 , and the processor of the communication device 42 , and further comprises one or more state sensors, contained within the housing of the communication device 42 which are not seen in FIG. 5 .
  • the processor is configured to identify in images captured by the image sensors 46 a predefined object pointing at a real world object in the real world scene.
  • FIG. 3 a shows the system 2 in use.
  • the goggles 4 are placed over the eyes of a user 18 .
  • the user faces the real world scene 8 and thus views the scene 8 .
  • FIG. 3 b shows the system 40 in use.
  • the user 18 holds the communication device 42 with the image sensors 46 facing the real world scene 8 and the display device 48 facing the user.
  • the system 2 or 40 now executes the following process.
  • the view of the scene 8 that the user would see when using the system 2 or 40 is displayed on the display device.
  • FIG. 4 a shows the view of the scene 8 that the user would see when using the system 2 or 40 to view the real world scene 8 .
  • the processor 36 analyzes images obtained by the image sensors to determine when a predefined object in images captured by the image sensors is performing a predefined gesture in relation to a real world object in the real world scene 8 .
  • the viewing device 34 such as the goggles 4 or the communication device 42 is often not stable in use, due to movement of the user as occurs during walking, or movement of the user's head or hand. In this situation, the signal generated by the sensors 38 may be noisy and inaccurate. In this case, the machine vision module 37 runs a tracking algorithm that tracks the identified real world object and maintains the displayed associated visual data in a fixed position relative to the identified real world object.
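  • A minimal illustration of keeping the overlay pinned to a tracked object despite jitter; the exponential smoother below is an assumed stand-in for the tracking algorithm, not the algorithm described here:

```python
class OverlayAnchor:
    """Keeps displayed visual data at a fixed offset from a tracked real
    world object by smoothing noisy per-frame position estimates."""

    def __init__(self, offset=(0, -40), smoothing=0.8):
        self.offset = offset        # overlay position relative to the object
        self.smoothing = smoothing  # higher = steadier, slower to follow
        self.position = None        # smoothed object position (x, y)

    def update(self, detected_xy):
        """Feed the latest (noisy) detection; return the overlay's (x, y)."""
        if self.position is None:
            self.position = detected_xy
        else:
            px, py = self.position
            dx, dy = detected_xy
            a = self.smoothing
            self.position = (a * px + (1 - a) * dx, a * py + (1 - a) * dy)
        return (self.position[0] + self.offset[0],
                self.position[1] + self.offset[1])

anchor = OverlayAnchor()
for noisy_detection in [(320, 200), (324, 198), (318, 203)]:
    print(anchor.update(noisy_detection))
```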
  • the predefined gesture relating to a real world object or to an activatable icon may be, for example, pointing at the real world object or an activatable icon, or performing a swiping gesture over the real world object or an activatable icon.
  • the activatable icon may or may not be correlated to a real world object.
  • Other possible predefined gestures include a swiping motion, a pinching motion of two fingers such as with the forefinger and thumb or the middle finger and thumb, pointing, a left to right gesture, a right to left gesture, an upwards gesture, a downwards gesture, a pushing gesture, opening a clenched fist, opening a clenched fist and moving towards the image sensor, a tapping gesture, a waving gesture, a clapping gesture, a reverse clapping gesture, closing a hand into a fist, a pinching gesture, a reverse pinching gesture, a gesture of splaying fingers on a hand, a reverse gesture of splaying fingers on a hand, pointing at an activatable icon, or at a real world object, pointing at an activatable icon or a real world object for a predefined amount of time, clicking on an activatable icon or real world object, double clicking on an activatable icon or real world object, clicking with a forefinger on an activ
  • the predefined object may be, for example, a user hand, a part of a user's hand, such as the user's finger 20 or parts of two different hands.
  • the predefined object may be a stylus or wand.
  • the feedback may be a visual indication in a form selected from an activatable icon displayed on a display device, a change in an activatable icon on a display device, a change in color of an activatable icon on a display device, a change in size of an activatable icon, animation of an activatable icon, an indication light, an indicator moving on a display device, a vibration, a directional vibration indication, an air tactile indication.
  • the indication may be provided by an indicator moving on a display device that appears on top of all other images or video appearing on the display device.
  • Visual feedback may be the appearance of a glow around the predefined object when a system recognizes the predefined object.
  • the gesture detection module 40 may use any method for detecting the predefined objects in images obtained by the image sensor 32 .
  • the gesture detection module may detect the predefined object as disclosed in WO2005/091125 or WO 2010/086866.
  • the processor 16 is further configured to determine the real world object in the scene 8 towards which the predefined gesture was performed. Thus, for example, in the image shown in FIG. 4 b , the processor 16 would determine that the user's finger 20 is pointing at the billboard 14 by determining the fingertip location (X,Y) in the image and combining this information with the location of the user and the orientation of the goggles 4 from the state sensors 21 .
  • the real world object is thus identified by the processor without presenting to the user a cursor or other marker to indicate the real world object that the user wishes to select, enabling direct pointing at a real world object to start an interaction.
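  • Continuing the bearing idea sketched earlier, the pointed-at object can be resolved by comparing the pointing bearing against the bearings of geo-registered candidates around the user's GPS position (the coordinates, names and tolerance below are hypothetical):

```python
import math

def bearing_to(user_lat, user_lon, obj_lat, obj_lon):
    """Approximate compass bearing (degrees) from the user to an object,
    using a flat-earth approximation that is adequate for nearby objects."""
    d_lat = obj_lat - user_lat
    d_lon = (obj_lon - user_lon) * math.cos(math.radians(user_lat))
    return math.degrees(math.atan2(d_lon, d_lat)) % 360.0

def pick_pointed_object(pointing_bearing, user_pos, candidates, tolerance_deg=5.0):
    """Return the geo-registered candidate whose bearing from the user best
    matches the pointing bearing, or None if nothing is within tolerance."""
    best, best_err = None, tolerance_deg
    for name, (lat, lon) in candidates.items():
        err = abs((bearing_to(user_pos[0], user_pos[1], lat, lon)
                   - pointing_bearing + 180.0) % 360.0 - 180.0)
        if err < best_err:
            best, best_err = name, err
    return best

candidates = {"billboard_14": (32.0860, 34.7820), "building_12": (32.0858, 34.7830)}
print(pick_pointed_object(12.0, (32.0850, 34.7818), candidates))  # -> billboard_14
```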
  • the processor 16 searches in a memory, which may be integral with the processor 16 or may be remotely located, for data relating to the real-world object to which the user's finger 20 is pointing.
  • the memory may have stored data relating to the billboard 14 .
  • the data is displayed on the display device 10 superimposed on the user's view of the scene.
  • visual data 21 relating to the billboard 14 is displayed on the display device 10 , as shown in FIG. 4 c.
  • the visual data 21 may be static or animated.
  • the visual data 21 may include one or more activatable icons, such that when a predefined gesture is performed relative to one of the activatable icons, a command associated with the activatable icon is executed.
  • the command may be, for example, to display specific visual material relating to the selected real world object.
  • the activatable icons may be 2D or 3D activatable icons and may be presented to the user so that the user perceives the icon in front of him in a 3D space.
  • an activatable icon is a region in a 2D or 3D image or video associated with one or more messages activated by user interaction.
  • the activatable icons may be, for example, a 2D or 3D visual element.
  • the activatable icons may be virtual buttons, a virtual keyboard, a 2D or 3D activatable icon, a region in an image or a video.
  • An activatable icon may consist of two or more activatable icons.
  • the processor may not present part of the activatable icon where the predefined object is located, so that the predefined object appears to be on top of the activatable icon.
  • the activatable icons may be removed when the user rapidly moves his head and then returned when the head motion is below a predefined motion speed.
  • the system 2 may have two or more operational modes and the processor 16 may be configured to identify one or more predefined gestures to change between the operational modes.
  • a gesture may be used to turn the system on or off, select the source of the visual material to be presented, select the level of details of the visual material to be presented, select the buttons or activatable icons to be presented to the user, or activate an online service, such as an online service related to a selected real world object.
  • Yet another mode of operation may be to start video recording of images by the image sensor and/or recording of sounds by a microphone upon identification of a predefined gesture and to stop recording upon identification of another predefined gesture.
  • Yet another mode of operation is continuously monitoring video and/or sound, but following a detection of a predefined gesture, the video/sound is recorded starting from a predetermined amount of time prior to identification of the gesture, and stopping the recording after identification of another predefined gesture.
  • the predetermined time may be defined by the user.
  • Yet another mode of operation is adding tags in a captured and real-time recorded video upon identification of a predefined gesture.
  • Yet another mode of operation is shown in FIG. 6.
  • an area 62 in the field of view 60 as captured by the image sensor is designated by the user performing a gesture of “drawing” the contour of the area, shown by phantom lines in FIG. 6 .
  • the selected area is then resized by the user performing a second gesture, such as separating two fingers or bringing two fingers closer together as indicated by the arrows 66 in FIG. 6 b , until the selected area attains the desired size ( 67 in FIG. 6 c ).
  • the area 67 is then dragged to a new location in the field of view ( FIG. 6 d ) and copied in the new location in the field of view.
  • the system then employs a tracker on the selected area and the selected area is presented in real-time in the resized and relocated area set by the user on the display device.
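  • A rough sketch of this select-resize-relocate behaviour using NumPy array slicing and a nearest-neighbour resize (the coordinates are arbitrary and this is not the implementation described here):

```python
import numpy as np

def crop(frame, top, left, height, width):
    """Cut the user-designated area out of the camera frame."""
    return frame[top:top + height, left:left + width]

def resize_nearest(region, new_h, new_w):
    """Nearest-neighbour resize of the selected area (no OpenCV required)."""
    rows = np.arange(new_h) * region.shape[0] // new_h
    cols = np.arange(new_w) * region.shape[1] // new_w
    return region[rows][:, cols]

def paste(frame, region, top, left):
    """Copy the resized area to its new location in the field of view."""
    h, w = region.shape[:2]
    out = frame.copy()
    out[top:top + h, left:left + w] = region
    return out

frame = np.zeros((480, 640), dtype=np.uint8)
frame[100:150, 200:260] = 255                      # pretend content of interest
selected = crop(frame, 100, 200, 50, 60)           # gesture-drawn contour
enlarged = resize_nearest(selected, 100, 120)      # pinch-apart resize gesture
composited = paste(frame, enlarged, 300, 400)      # dragged to a new location
print(composited.shape, composited[350, 450])      # -> (480, 640) 255
```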
  • a bounding box around a displayed activatable icon may be defined as a region of the images that remains fixed.
  • the system employs a machine vision tracker to track this bounding box.
  • For example, the video tracker may determine that the distance between the locations of the bounding box in two frames of a video sequence is less than a predefined distance, and that the correlation value of the tracker of the bounding box is below a predefined value.
  • CPU usage can be minimized by searching for the predefined object only in the vicinity of each displayed activatable icon.
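  • A sketch of restricting the search to the neighbourhood of each displayed activatable icon; the icon layout, margin and detector call are assumptions used to keep the example self-contained:

```python
def expand(box, margin, frame_w, frame_h):
    """Grow an icon's bounding box by a margin, clamped to the frame."""
    left, top, right, bottom = box
    return (max(0, left - margin), max(0, top - margin),
            min(frame_w, right + margin), min(frame_h, bottom + margin))

def detect_near_icons(frame, icon_boxes, detector, margin=30,
                      frame_w=640, frame_h=480):
    """Run the predefined-object detector only inside regions around the
    displayed activatable icons instead of over the whole frame."""
    hits = {}
    for icon_id, box in icon_boxes.items():
        roi = expand(box, margin, frame_w, frame_h)
        hit = detector(frame, roi)         # detector searches only inside roi
        if hit is not None:
            hits[icon_id] = hit
    return hits

# Stub detector so the sketch runs without a real vision pipeline.
fake_detector = lambda frame, roi: (roi[0] + 5, roi[1] + 5)
icons = {"volume_up": (500, 50, 560, 110), "volume_down": (500, 150, 560, 210)}
print(detect_near_icons(frame=None, icon_boxes=icons, detector=fake_detector))
```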
  • the object recognition module is not activated all the time, but only when the headset is not moving significantly, as determined from information obtained by the state sensors.
  • a user may choose different filters to screen data correlated with real-world objects, such as a filter “display data generated only by friends”, or display data from registered sources, or data generated in the last three months.
  • the system 2 may have a stand-by mode in which the power consumption by the system 2 is minimal.
  • the active mode may be different from the stand-by mode, for example, in the number of video frames per second that are being analyzed by the system, the resolution of images that are being analyzed, the portion of the image frame that is being analyzed, and/or the detection modules that are activated.
  • the system 2 can be brought to the active mode by any technique.
  • the system 2 may be brought to the active mode by bringing the predefined object into the field of view from below; by placing the predefined object in a certain location or pose, such as pointing at the bottom right corner of the camera field of view or opening a hand in the camera field of view; by performing a predefined gesture such as moving the hand from right to left across the field of view; when an activatable icon is displayed, by performing a predefined gesture correlated with the activatable icon, such as pointing at the activatable icon, performing a waving gesture at the location where the activatable icon is presented, or sliding the floating activatable icon from one location to another by performing a gesture in the 3D space where the activatable icon is perceived to be located; by touching the device; or by tapping on the device if the device is provided with an accelerometer.
  • the system may enter the active mode when the user passes a hand near the device if the device is provided with a proximity sensor, or ultrasonic sensor.
  • the system may also be activated by a voice command, or when the user places the predefined object in a particular location in the field of view.
  • the system may enter the active mode only when there is relevant data associated with the real world in the field of view of the user. When the system may indicate to the user when there is a relevant data to be presented, or when it is ready for interaction.
  • a visual indication may be attached to a real-world object to let the user know that there is data correlated with the real-world object.
  • Indication of relevant data may be overlaid on the location of the real-world object as a small visual indication such as an activatable icon of “i” may indicate information, and a logo of “photos” may indicate images related to the real-world object, or a logo of “envelop” indicates a message that was left by a friend or other user correlated to the real-world object.
  • a small visual indication such as an activatable icon of “i” may indicate information
  • a logo of “photos” may indicate images related to the real-world object
  • a logo of “envelop” indicates a message that was left by a friend or other user correlated to the real-world object.
  • the system 2 may be configured to undergo a calibration process to record various physical parameters of the predefined object so as to facilitate identification of the predefined object in images obtained by the camera by the processor 2 . This may be done, for example, by presenting to the user on the display activatable icons in different locations in the 3D space, and extracting physical features of the predefined object such as its size or orientation of the predefined object, and determining a correlation between the dimensions of the predefined object and its distance from the camera.
  • the calibration may involve calculating the triangular of camera, the user's line of sight and the tip of the predefined object to determine the user is pointing at. The accuracy is improved by estimating the distance of the real world object from the camera based on information extracted in the calibration.
  • the processor may be configured to identify in images obtained by the camera of the real world scene by another user of the system of the invention.
  • the identification of another user in the real world scene may be performed, for example, by informing a remote server of the locations of the devices in a particular geographical area.
  • the locations of the other devices can be sent to all of the devices in the geographical area.
  • the two systems may be used for game playing.
  • the other user may be represented to as an avatar with whom the user can interact by gestures such as send a message to the other user such as “like”.
  • the processor may be configured to display a keyboard that enables text typing with one or more fingers or hands.
  • Display of the keyboard may be initiated upon detection of a predefined gesture such as a gesture from right to left, or by the using presenting an open hand, or two open hands in a predefined region of the field of view of the camera, such as the bottom part of the field of view.
  • a predefined gesture such as a gesture from right to left, or by the using presenting an open hand, or two open hands in a predefined region of the field of view of the camera, such as the bottom part of the field of view.
  • Yet another way to initiate the display of the keyboard is when the user performs a click gesture in the 3D space where the typing area or an activatable icon is perceived to be located.
  • the keyboard may be used, for example, in order to, write a note, conduct a search or to communicate with online services (such as Skype or twitter) by typing on virtual keyboard.
  • the system may not present part of the keyboard where the predefined object is located, so that
  • an animated hand When the system is in a typing mode, an animated hand may be presented on the keyboard whose position is correlated with the user's hands and fingers. The fingertips of the animated hands may be located above a virtual keystroke at the location where the character of the keystroke is seen.
  • the keyboard and the animated hands are preferably opaque, so that the user is unable see the background behind the keyboard. This tends to make the keyboard clearer to the user.

Abstract

A method and system for augmented reality. Images of a real world scene are obtained from one or more image sensors. An orientation and/or location of the image sensors is obtained from one or more state sensors. A real world object at which a predefined pointing object is performing a predefined gesture in the images of the real world scene is identified, and data associated with the identified object is displayed on a viewing device. The invention also provides a computer program comprising computer program code means for performing all the steps of the method of the invention when said program is run on a computer.

Description

    TECHNOLOGICAL FIELD
  • The present invention relates to methods and systems for augmented reality.
  • PRIOR ART
  • References considered to be relevant as background to the presently disclosed subject matter are listed below:
      • U.S. Pat. No. 7,126,558;
      • US Published Patent Application 20110221669;
      • US Published Patent Application 20110270522;
      • GB2465280(A);
      • US Published Patent Application 20120068913;
      • U.S. Pat. No. 7,215,322;
      • WO2005/091125;
      • WO 2010/086866
      • Crowley, J. L. et al, Finger Tracking as an Input Device for Augmented Reality. Published in the proceedings of the International Workshop on Face and Gesture Recognition, Zurich, Switzerland, June 1995.
  • Acknowledgement of the above references herein is not to be inferred as meaning that these are in any way relevant to the patentability of the presently disclosed subject matter.
  • BACKGROUND
  • Augmented reality is a term for a live, direct or indirect, view of a physical, real-world environment whose elements are augmented by computer-generated information such as text, sound, video, graphics or GPS data. Artificial information about the environment and its objects is thus overlaid on a real world view or image. Augmentation is typically in real-time and in semantic context with environmental elements so that information about the surrounding real world of the user becomes interactive and digitally manipulable.
  • The main hardware components for augmented reality are a processor, display, sensors and input devices. These elements, specifically a CPU, display, camera and MEMS sensors such as an accelerometer, GPS, or solid state compass, are present in portable devices such as smartphones, allowing them to function as augmented reality platforms.
  • Augmented reality systems have found applications in entertainment, navigation, assembly processes, maintenance, and medical procedures. Portable augmented reality systems have also found applications in tourism and sightseeing, where augmented reality is used to present information about the real world objects and places being viewed.
  • An immersive augmented reality experience is provided using a head-mounted display, typically in the form of goggles or a helmet. With a head-mounted display, virtual visual objects are superimposed on the user's view of a real world scene. The head mounted display is tracked with sensors that allow the system to align virtual information with the physical world. The tracking may be performed, for example, using any one or more of such technologies as digital cameras or other optical sensors, accelerometers, GPS, gyroscopes, solid state compasses, RFID and wireless sensors. Head-mounted displays are either optical see-through or video see-through. Optical see-through employs solutions such as half-silver mirrors to pass images through the lens and overlay information to be reflected into the user's eyes, and transparent LCD projectors that display the digital information and images directly or indirectly to the user's retina.
  • General Description
  • The present invention provides an interactive system for augmented reality. The interactive system of the invention includes a wearable data display device that may be incorporated, for example, into a pair of glasses or goggles. The wearable display has a device providing location extraction capabilities (such as GPS) and a compass. The system also includes a user interface that allows a user to select computer generated data to augment a real world scene that the user is viewing. A camera obtains images of the real-world scene being viewed. A processor detects a predefined object, such as a user's finger, in images of the real world scene captured by the camera. When the user points to an element in the scene, data relating to the element are displayed on the data display device and are superimposed on the user's view of the scene.
  • Thus, in one of its aspects, the invention provides a method for augmented reality comprising:
      • (a) obtaining images of a real world scene from one or more image sensors;
      • (b) obtaining from one or more state sensors one or both of an orientation and a location data of the image sensors;
      • (c) identifying in the images of the real world scene obtained by the image sensor or sensors a real world object at which a predefined pointing object is performing a predefined gesture, the gesture detection module utilizing data provided by the one or more state sensors; and
      • (d) presenting data associated with the identified object on a display of a viewing device.
  • The image sensor may be selected from a camera, a light sensor, an IR sensor, an ultrasonic sensor, a proximity sensor, a CMOS image sensor, a shortwave infrared (SWIR) image sensor, and a reflectivity sensor. One or more of the state sensors may be selected from an optical sensor, an accelerometer, GPS, a gyroscope, a compass, a magnetic sensor, a sensor indicating the direction of the device relative to the Earth's magnetic field, a gravity sensor, and an RFID detector.
  • The data associated with the identified object may be obtained by searching in a memory for data associated with the real world object.
  • The predefined object may be, for example, a hand, a part of a hand, two hands, parts of two hands, a finger, part of a finger, or a finger tip.
  • The viewing device may be configured to be worn by a user, for example, glasses or goggles. The viewing device may be incorporated in a mobile communication device.
  • The step of identifying in the images of the real world scene obtained by the image sensor or sensors may comprise determining a location (X,Y) of the predefined object in an image obtained by the image sensors and determining one or both of location and an orientation of the display device provided by the sensors.
  • The method of the invention may further comprise communicating with an external device or website. The communication may comprise sending a message to an application running on the external device, a service running on the external device, an operating system running on the external device, a process running on the external device, one or more applications running on a processor of the external device, a software program running in the background of the external device, or to one or more services running on the external device. The method may further comprise sending a message to an application running on the mobile communication device, a service running on the mobile communication device, an operating system running on the mobile communication device, a process running on the mobile communication device, one or more applications running on a processor of the mobile communication device, a software program running in the background of the mobile communication device, or to one or more services running on the mobile communication device.
  • The method may further comprise sending a message requesting data relating to a real world object identified in an image from an application running on the external device, a service running on the external device, an operating system running on the external device, a process running on the external device, one or more applications running on a processor of the external device, a software program running in the background of the external device, or from one or more services running on the external device. The method may further comprise sending a message requesting data relating to a real world object identified in an image from an application running on the mobile communication device, a service running on the mobile communication device, an operating system running on the mobile communication device, a process running on the mobile communication device, one or more applications running on a processor of the mobile communication device, a software program running in the background of the mobile communication device, or from one or more services running on the mobile communication device.
  • The message to the external device or website may be a command. The command may be selected from a command to run an application on the external device or website, a command to stop an application running on the external device or website, a command to activate a service running on the external device or website, a command to stop a service running on the external device or website, or a command to send data relating to a real world object identified in an image.
  • The message to the mobile communication device may be a command. The command may be selected from a command to run an application on the mobile communication device, a command to stop an application running on the mobile communication device, a command to activate a service running on the mobile communication device, a command to stop a service running on the mobile communication device, or a command to send data relating to a real world object identified in an image.
  • The method may further comprise receiving from the external device or website data relating to a real world object identified in an image and presenting the received data to a user.
  • The communication with the external device or website may be over a communication network.
  • The command to the external device may be selected from depressing a virtual key displayed on a display device of the external device; rotating a selection carousel; switching between desktops, running on the external device a predefined software application; turning off an application on the external device; turning speakers on or off; turning volume up or down; locking the external device, unlocking the external device, skipping to another track in a media player or between IPTV channels; controlling a navigation application; initiating a call, ending a call, presenting a notification, displaying a notification; navigating in a photo or music album gallery, scrolling web-pages, presenting an email, presenting one or more documents or maps, controlling actions in a game, pointing at a map, zooming-in or out on a map or images, painting on an image, grasping an activatable icon and pulling the activatable icon out from the display device, rotating an activatable icon, emulating touch commands on the external device, performing one or more multi-touch commands, a touch gesture command, typing, clicking on a displayed video to pause or play, tagging a frame or capturing a frame from the video, presenting an incoming message; answering an incoming call, silencing or rejecting an incoming call, opening an incoming reminder; presenting a notification received from a network community service; presenting a notification generated by the external device, opening a predefined application, changing the external device from a locked mode and opening a recent call application, changing the external device from a locked mode and opening an online service application or browser, changing the external device from a locked mode and opening an email application, changing the external device from a locked mode and opening an online service application or browser, changing the device from a locked mode and opening a calendar application, changing the device from a locked mode and opening a reminder application, changing the device from a locked mode and opening a predefined application set by a user, set by a manufacturer of the external device, or set by a service operator, activating an activatable icon, selecting a menu item, moving a pointer on a display, manipulating a touch free mouse, activating an activatable icon on a display, and altering information on a display.
  • In the method of the invention, the predefined gesture may be selected from a swiping motion, a pinching motion of two fingers, pointing, a left to right gesture, a right to left gesture, an upwards gesture, a downwards gesture, a pushing gesture, opening a clenched fist, opening a clenched fist and moving towards the image sensor, a tapping gesture, a waving gesture, a clapping gesture, a reverse clapping gesture, closing a hand into a fist, a pinching gesture, a reverse pinching gesture, a gesture of splaying fingers on a hand, a reverse gesture of splaying fingers on a hand, pointing at an activatable icon, holding an activating object for a predefined amount of time, clicking on an activatable icon, double clicking on an activatable icon, clicking from the right side on an activatable icon, clicking from the left side on an activatable icon, clicking from the bottom on an activatable icon, clicking from the top on an activatable icon, grasping an activatable icon or the object, gesturing towards an activatable icon or the object from the right, gesturing towards an activatable icon from the left, passing through an activatable icon from the left, pushing the object, clapping, waving over an activatable icon, performing a blast gesture, performing a tapping gesture, performing a clockwise or counter clockwise gesture over an activatable icon, sliding an icon, grasping an activatable icon with two fingers, and performing a click-drag-release motion.
  • The data associated with the identified object may be any one or more of visual data, audio data, or textual data. The data associated with the identified object may be an activatable icon. The activatable icon may be a 2D or 3D activatable icon. The activatable icon may be perceived by a user in a 3D space in front of the user.
  • The method of the invention may have two or more operational modes. The method may change the operational mode of the system upon identification of a predefined gesture. An operational mode may be specified by any one or more of: the gestures to be identified, the algorithms that are active in the gesture detection module, a resolution of images captured by the image sensor, a capture rate of images captured by the image sensor, the level of details of the data to be presented, a source of the data to be presented, the activatable icons to be presented to the user on the display device, and an active on-line service.
  • The operational mode may be a mode selected from a mode of video recording of images by the image sensor upon identification of a predefined gesture; a mode of recording sounds by a microphone upon identification of a predefined gesture and stopping recording upon identification of another predefined gesture; a mode of continuously monitoring video or sound and, following a detection of a predefined gesture, recording the video or sound starting from a predefined amount of time prior to identification of the gesture, and stopping the recording after identification of another predefined gesture; a mode of adding tags in a captured and real-time recorded video upon identification of a predefined gesture; a mode of selecting an area in the field of view as captured by the camera, copying the area to another location in the field of view and resizing it; a mode employing a tracker on a selected area in an image and presenting the selected area in real-time in the resized and relocated area on the display device; and a mode of capturing an image upon identification of a predefined gesture.
  • The method of the invention may further comprise running a tracking algorithm that tracks the identified real world object and maintains the displayed associated visual data in a fixed position relative to the identified real world object.
  • An object recognition module may be employed to detect the predefined object only when the display device has a level of motion below a predetermined threshold.
  • The method may further comprise providing feedback when a predefined gesture has been identified. The feedback may be, for example, visual feedback, audio feedback, haptic feedback, directional vibration, air tactile feedback, or ultrasonic feedback. The feedback may be a visual indication in a form selected from an activatable icon displayed on the display device, a change in an activatable icon displayed on the display device, a change in color of an activatable icon displayed on the display device, a change in size of an activatable icon displayed on the display device, animation of an activatable icon displayed on the display device, an indication light, an indicator moving on a display device, an indicator moving on the display device that appears on top of all other images or video appearing on the display device and the appearance of a glow around the predefined object. The feedback may be a vibration, a directional vibration indication, or an air tactile indication.
  • In the method of the invention, part of an activatable icon displayed on the display device may not be presented where the predefined object is located, so that the predefined object appears to be on top of the activatable icon.
  • Activatable icons may be removed from the display device when the display device has a level of activity above a predefined threshold. The removed icons may be returned to the display device, for example, when the display device has a level of motion below the predefined threshold.
  • The method may be brought into an active mode when a predefined action is performed. The predefined action may be selected from bringing the predefined object into the field of view from below, placing the predefined object in a certain location or pose, such as pointing at the bottom right corner of the camera field of view or opening a hand in the camera field of view, performing a predefined gesture such as moving the hand from right to left across the field of view, performing, when an activatable icon is displayed, a predefined gesture correlated with the activatable icon, such as pointing at the activatable icon, waving in the location where the activatable icon is presented, or sliding the floating activatable icon from one location to another by performing a gesture in the 3D space where the activatable icon is perceived to be located, touching the device, or tapping on the device if the device is provided with an accelerometer. As yet another example, the system may enter the active mode when the user passes a hand near the device, if the device is provided with a proximity sensor or an ultrasonic sensor. The system may also be activated by a voice command, or when the user places the predefined object in a particular location in the field of view. As yet another example, the system may enter the active mode only when there is relevant data associated with the real world in the field of view of the user. The system may indicate to the user when there is relevant data to be presented, or when it is ready for interaction.
  • The method of the invention may further comprise attaching a visual indication to a real-world object indicating the existence in a memory of data correlated with the real-world object. The visual indication may be overlaid on an image of the real-world object. The visual indication may be selected from an activatable icon, a photo, and an image of an envelope.
  • The method of the invention may further comprise a calibration process to record one or more physical parameters of the predefined object. The calibration process may comprise any one or more steps selected from presenting on the display activatable icons in different locations in a 3D space, extracting physical features of the predefined object, and determining a correlation between dimensions of the predefined object and its distance from the camera. The calibration process may comprise a step of constructing a triangle having vertices at one of the image sensors and at a tip of the predefined object and having a side formed by a user's line of sight. The distance of the real world object from the camera may be estimated based on information extracted in the calibration.
  • The method may further comprise displaying a keyboard enabling text typing. The keyboard may be displayed upon detection of a predefined gesture, such as a gesture from right to left, presenting an open hand, presenting two open hands in a predefined region of the field of view of an image sensor. The keyboard may be displayed upon performing a click gesture in a 3D typing area or where a predefined activatable icon is perceived to be located.
  • The invention also provides a system comprising a device configured to execute the method of the invention.
  • The invention also provides a computer program comprising computer program code means for performing all the steps of the method of the invention when said program is run on a computer. The computer program may be embodied on a computer readable medium.
  • A user may interact with a visual image typically displayed through glasses. The user's view of reality is, thus, augmented by the information presented on the display. One issue with augmented reality devices is the manner in which the user interacts with and controls the device. Traditional control devices, e.g., a mouse, track ball, or touch screen, are difficult to use with augmented reality devices. Using gesture recognition in an augmented reality system is not trivial, because the user, and thus the augmented reality device, is constantly moving in real time.
  • The invention thus provides a computer program product containing instructions for causing a processor to perform a method comprising:
  • receiving, from an image sensor associated with an augmented reality
  • device, image information associated with an environment;
  • displaying, on a display associated with the device, augmented information related to the environment;
  • recognizing, in the image information, a hand gesture by a user of the device;
  • correlating the hand gesture with the augmented information; and
  • altering the displayed augmented information based on the correlating.
  • The augmented information may include at least one of information associated with objects in the environment; images associated with the environment; and distances associated with the environment.
  • The correlating may include determining a reference location in three dimensional space of at least a portion of the user's hand, and determining in at least one of the augmented information and the image information data associated with the reference location.
  • The altering may include changing the augmented information as a function of the data associated with the reference location.
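  • By way of non-limiting illustration only, the following Python sketch shows one possible form of the recognize/correlate/alter cycle described above; the gesture representation, the item fields, and the nearest-item correlation rule are assumptions of the sketch and are not taken from the specification.

```python
from dataclasses import dataclass
from typing import Dict, List, Tuple

@dataclass
class AugmentedItem:
    label: str
    position: Tuple[int, int]   # where the information is drawn on the display
    visible: bool = True

def alter_augmented_info(items: List[AugmentedItem], gesture: Dict) -> None:
    """Correlate a recognized hand gesture with the displayed augmented
    information and alter the display accordingly."""
    ref_x, ref_y = gesture["reference_location"]   # reference location of the hand
    # Correlate: pick the displayed item nearest to the gesture's reference location.
    target = min(items, key=lambda it: (it.position[0] - ref_x) ** 2 +
                                       (it.position[1] - ref_y) ** 2)
    # Alter the displayed information based on the correlation.
    if gesture["kind"] == "tap":
        target.visible = not target.visible
    elif gesture["kind"] == "swipe_left":
        items.remove(target)

items = [AugmentedItem("billboard info", (220, 90)), AugmentedItem("distance", (40, 40))]
alter_augmented_info(items, {"kind": "tap", "reference_location": (215, 95)})
print(items[0].visible)   # False: the item nearest the gesture was toggled
```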
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • In order to understand the invention and to see how it may be carried out in practice, embodiments will now be described, by way of non-limiting example only, with reference to the accompanying drawings, in which:
  • FIG. 1 shows schematically a system for augmented reality in accordance with one embodiment of the invention;
  • FIG. 2 shows a system for augmented reality comprising a set of goggles in accordance with one embodiment of the invention;
  • FIG. 3 shows the system of FIG. 2 in use;
  • FIG. 4 a shows a view of a real-world scene displayed on a display device of the system of FIG. 2, FIG. 4 b shows the view of FIG. 4 a with the user's finger pointing to an object in the view, and FIG. 4 c shows visual text relating to the object at which the user's finger is pointing overlaid on the view of FIG. 4 b;
  • FIG. 5 shows a system for augmented reality integral with a communication device in accordance with another embodiment of the invention; and
  • FIG. 6 a shows designating an area in the field of view of an image sensor by the user performing a gesture of “drawing” the contour of the area, FIG. 6 b shows resizing the selected area by performing a second gesture, FIG. 6 c shows the area after resizing, and FIG. 6 d shows the area after being dragged to a new location in the field of view.
  • DETAILED DESCRIPTION OF EMBODIMENTS
  • FIG. 1 shows schematically a system 30 for augmented reality in accordance with one embodiment of the invention. The system 30 includes one or more image sensors 32 configured to obtain images of a real world scene. Any type of image sensor may be used in the system of the invention, such as a camera, a light sensor, an IR sensor, an ultrasonic sensor, a proximity sensor, a CMOS image sensor, a shortwave infrared (SWIR) image sensor, or a reflectivity sensor.
  • The system 30 further includes a viewing device 34 having one or more display devices 35 that enable a user to see both the real world scene and external information, such as images, videos, or audio signals, superimposed upon the real world scene. Any type of display device that allows a user to both see the real world scene and the displayed data may be used in the system of the invention.
  • The display devices 35 may comprise, for example, a surface upon which visual material is presented to a user or one or more projectors that display images directly to the user's retina. A processor 36 obtains orientation and/or location data of the system 30 from one or more state sensors 38, that may be, for example, any one or more of an optical sensor, an accelerometer, GPS, a gyroscope, a solid state compass, a magnetic sensor, a gravity sensor, and an RFID detector. The processor 36 may be, for example, a dedicated processor, a general purpose processor, a DSP (digital signal processor), a GPU (graphics processing unit), dedicated hardware, or a processor that can run on an external device. The system 30 may run as software on the viewing device 34, or on another device 37, such as a smartphone, that incorporates the other components of the system 30.
  • The processor 36 is configured to run a gesture detection module 40 that identifies in images of the real world scene obtained by the image sensor 32 one or more real world objects at which a predefined object is pointing. The real world objects may be, for example, a building or a billboard. Determination of the real world objects utilizes data provided by the state sensors 38. The predefined object may be a user's finger or other object such as a stylus or wand.
  • When the processor 36 has identified a real world object at which the predefined object is pointing, the processor searches in a memory 42 for data associated with the identified object. The data may be, for example, visual data, audio data, or textual data. The visual data may be textual information relating to the identified object. The processor then displays the associated visual data associated with the identified object on the display of the viewing device. The memory 42 may be integral with the system 30 or may be remotely located and accessed over a communication network, such as the Internet. The system 30 may thus comprise a communication module 39 allowing the system 30 to communicate with a network, wireless network, cellular network, an external device such as another device 30, a mobile phone, tablet, or an Internet website and so on.
  • The data may be an activatable icon. As used herein, the term “activatable icon” refers to a region in an image or video associated with one or more messages or commands that are activated by a user interaction. The activatable icons may be, for example, a 2D or 3D visual element such as virtual buttons, a virtual keyboard or icon. Activatable icons are activated by means of one or more predefined objects that are recognizable by the system, and may be, for example, a stylus, one or more of a user's hands or a portion of a hand, one or more fingers or a portion of a finger such as a finger tip. Activation of one or more of the activatable icons by a predefined object results in the generation of a message or a command addressed to an operating system, one or more services, one or more applications, one or more devices, one or more remote applications, one or more remote services, or one or more remote devices.
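  • The following Python sketch illustrates, purely by way of example, how an activatable icon of this kind might be represented as a screen region with an associated command; the class, its fields, and the gesture names are assumptions of the sketch and are not taken from the specification.

```python
from dataclasses import dataclass
from typing import Callable, Tuple

@dataclass
class ActivatableIcon:
    bounds: Tuple[int, int, int, int]     # (x, y, width, height) on the display
    on_activate: Callable[[], None]       # message/command addressed to an app or service

    def contains(self, point: Tuple[int, int]) -> bool:
        x, y, w, h = self.bounds
        px, py = point
        return x <= px <= x + w and y <= py <= y + h

    def try_activate(self, fingertip: Tuple[int, int], gesture: str) -> bool:
        """Activate when the predefined object performs a pointing or clicking
        gesture inside the icon's region."""
        if gesture in ("point", "click") and self.contains(fingertip):
            self.on_activate()
            return True
        return False

# Usage: an icon that issues a hypothetical command when clicked by a fingertip.
icon = ActivatableIcon((100, 40, 64, 64), on_activate=lambda: print("run application"))
icon.try_activate(fingertip=(120, 60), gesture="click")
```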
  • The processor 36 may be configured to send a message or command to the device 37 or to a remote device, to an application running on the device, to a service running on the device 37, to an operating system running on the device, to a process running on the device, to a software program running in the background, or to one or more services or processes running on the device. The message or command may be sent over a communication network such as the Internet or a cellular phone network. The command may be, for example, a command to run an application on the device, a command to stop an application running on the device, a command to activate a service running on the device, a command to stop a service running on the device, or a command to send data to the processor 36 relating to a real world object identified in an image by the processor 36.
  • The command may be a command to the device 37, such as depressing a virtual key displayed on a display device of the device; rotating a selection carousel; switching between desktops, running on the device a predefined software application; turning off an application on the device; turning speakers on or off; turning volume up or down; locking the device, unlocking the device, skipping to another track in a media player or between IPTV channels; controlling a navigation application; initiating a call, ending a call, presenting a notification, displaying a notification; navigating in a photo or music album gallery, scrolling web-pages, presenting an email, presenting one or more documents or maps, controlling actions in a game, controlling interactive video or animated content, editing video or images, pointing at a map, zooming-in or out on a map or images, painting on an image, pushing an activatable icon away from the display device, grasping an activatable icon and pulling the activatable icon out from the display device, rotating an activatable icon, emulating touch commands on the device, performing one or more multi-touch commands, a touch gesture command, typing, clicking on a displayed video to pause or play, editing video or music commands, tagging a frame or capturing a frame from the video, cutting a subset of a video from a video, presenting an incoming message; answering an incoming call, silencing or rejecting an incoming call, opening an incoming reminder; presenting a notification received from a network community service; presenting a notification generated by the device, changing the device from a locked mode and activating a recent call application, changing the device from a locked mode and activating an online service application or browser, changing the device from a locked mode and activating an email application, changing the device from a locked mode and activating an online service application or browser, changing the device from a locked mode and activating a calendar application, changing the device from a locked mode and activating a reminder application, changing the device from a locked mode and activating a predefined application set by a user, set by a manufacturer of the device, or set by a service operator, activating an activatable icon, selecting a menu item, moving a pointer on a display, manipulating a touch free mouse, activating an activatable icon on a display, and altering information on a display.
  • The communication module may be used to transmit a message that may be addressed, for example, to a remote device. The message may be, for example, a command to a remote device. The command may be, for example, a command to run an application on the remote device, a command to stop an application running on the remote device, a command to activate a service running on the remote device, or a command to stop a service running on the remote device. The message may be a command to the remote device selected from depressing a virtual key displayed on a display device of the remote device; rotating a selection carousel; switching between desktops, running on the remote device a predefined software application; turning off an application on the remote device; turning speakers on or off; turning volume up or down; locking the remote device, unlocking the remote device, skipping to another track in a media player or between IPTV channels; controlling a navigation application; initiating a call, ending a call, presenting a notification, displaying a notification; navigating in a photo or music album gallery, scrolling web-pages, presenting an email, presenting one or more documents or maps, controlling actions in a game, pointing at a map, zooming-in or out on a map or images, painting on an image, grasping an activatable icon and pulling the activatable icon out from the display device, rotating an activatable icon, emulating touch commands on the remote device, performing one or more multi-touch commands, a touch gesture command, typing, clicking on a displayed video to pause or play, tagging a frame or capturing a frame from the video, presenting an incoming message; answering an incoming call, silencing or rejecting an incoming call, opening an incoming reminder; presenting a notification received from a network community service; presenting a notification generated by the remote device, opening a predefined application, changing the remote device from a locked mode and opening a recent call application, changing the remote device from a locked mode and opening an online service application or browser, changing the remote device from a locked mode and opening an email application, changing the remote device from a locked mode and opening an online service application or browser, changing the device from a locked mode and opening a calendar application, changing the device from a locked mode and opening a reminder application, changing the device from a locked mode and opening a predefined application set by a user, set by a manufacturer of the remote device, or set by a service operator, activating an activatable icon, selecting a menu item, moving a pointer on a display, manipulating a touch free mouse, activating an activatable icon on a display, and altering information on a display.
  • The message can be a request for data associated with the identified object. The data request may be addressed to an application, a service, a process, or a thread running on the device, to an application, a service, a process, or a thread running on an external device, or to an online service.
  • In order to reduce CPU resources, an object recognition module to detect the predefined object can be employed only when the headset is not moving significantly as determined from information obtained by the state sensors.
  • FIG. 2 shows a system 2 for augmented reality in accordance with one embodiment of the invention. The system 2 comprises a portable viewing device that may be, for example, an interactive head-mounted eyepiece such as a pair of eyeglasses or goggles 4. The goggles 4 are provided with an image sensor 6 that obtains images of a real-world scene 8. The scene 8 may include, for example, one or more buildings 12, or one or more billboards 14. The goggles may be provided with one or more display devices 10 that are located in the goggles 4 so as to be positioned in front of a user's eyes when the goggles 4 are worn by the user. The display devices 10 may be, for example, see-through devices such as transparent LCD screens through which the real world scene is viewed, together with presenting external data. The system 2 further comprises a processor 16 that is configured to identify, in images captured by the image sensor 6, a predefined object performing a gesture or pointing at a real world object in the real world scene 8 or at activatable icons displayed to the user. The system 2 also includes one or more location and/or orientation sensors 23 such as GPS, an accelerometer, a gyroscope, a solid state compass, a magnetic sensor, or a gravity sensor.
  • FIG. 5 shows a system 40 for augmented reality in accordance with another embodiment of the invention. The system 40 is integrated into a mobile communication device 42 such as a mobile phone, tablet, or camera. A front view of the communication device 42 is shown in FIG. 5 a, and a rear view of the communication device 42 is shown in FIG. 5 b. The communication device 42 is provided with an image sensor 46 on its rear surface, opposite to the display device, that obtains images of a real-world scene. The communication device 42 is also provided with a display device 48 on its front surface that is positioned in front of a user when the camera 46 is directed towards a real world scene. The display device 48 may be, for example, an LCD screen that presents to the user images of a real world scene obtained by the camera 46, together with visual data, as explained below. The system 40 utilizes the camera 46, the display device 48, and the processor of the communication device 42, and further comprises one or more state sensors, contained within the housing of the communication device 42, which are not seen in FIG. 5. The processor is configured to identify in images captured by the image sensor 46 a predefined object pointing at a real world object in the real world scene.
  • FIG. 3 a shows the system 2 in use. The goggles 4 are placed over the eyes of a user 18. The user faces the real world scene 8 and thus views the scene 8. FIG. 3 b shows the system 40 in use. The user 18 holds the communication device 42 with the image sensors 46 facing the real world scene 8 and the display device 48 facing the user.
  • The system 2 or 40 now executes the following process. FIG. 4 a shows the view of the scene 8, as displayed on the display device, that the user would see when using the system 2 or 40 to view the real world scene 8. The processor 36 analyzes images obtained by the image sensors to determine when a predefined object in images captured by the image sensors is performing a predefined gesture in relation to a real world object in the real world scene 8.
  • The viewing device 34, such as the goggles 4 or the communication device 42, is often not stable in use, due to movement of the user as occurs during walking, or movement of the user's head or hand. In this situation, the signal generated by the sensors 38 may be noisy and inaccurate. In this case, the machine vision module 37 runs a tracking algorithm that tracks the identified real world object and maintains the displayed associated visual data in a fixed position relative to the identified real world object.
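  • A minimal Python sketch of such anchoring is given below; it assumes the tracker already reports the object's pixel position each frame, and the offset and smoothing factor are illustrative assumptions rather than values taken from the specification.

```python
# Sketch of keeping overlaid data locked to a tracked real world object.
# The tracker itself is abstracted away; here it is just the object's pixel
# position reported each frame, smoothed to suppress sensor and tracking jitter.
class AnchoredOverlay:
    def __init__(self, offset=(0, -40), smoothing=0.6):
        self.offset = offset          # where the data sits relative to the object
        self.smoothing = smoothing    # 0 = no smoothing, 1 = frozen
        self._pos = None

    def update(self, tracked_xy):
        """tracked_xy: (x, y) of the identified real world object in this frame."""
        target = (tracked_xy[0] + self.offset[0], tracked_xy[1] + self.offset[1])
        if self._pos is None:
            self._pos = target
        else:
            a = self.smoothing
            self._pos = (a * self._pos[0] + (1 - a) * target[0],
                         a * self._pos[1] + (1 - a) * target[1])
        return self._pos   # draw the associated visual data at this position

overlay = AnchoredOverlay()
for frame_xy in [(200, 150), (203, 149), (207, 152)]:   # noisy track of the object
    print(overlay.update(frame_xy))
```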
  • The predefined gesture relating to a real world object or to an activatable icon, may be, for example, pointing at the real world object or an activatable icon, or performing a swiping gesture over the real world object or an activatable icon. The activatable icon may or may not be correlated to a real world object.
  • Other possible predefined gestures include a swiping motion, a pinching motion of two fingers such as with the forefinger and thumb or the middle finger and thumb, pointing, a left to right gesture, a right to left gesture, an upwards gesture, a downwards gesture, a pushing gesture, opening a clenched fist, opening a clenched fist and moving towards the image sensor, a tapping gesture, a waving gesture, a clapping gesture, a reverse clapping gesture, closing a hand into a fist, a pinching gesture, a reverse pinching gesture, a gesture of splaying fingers on a hand, a reverse gesture of splaying fingers on a hand, pointing at an activatable icon or at a real world object, pointing at an activatable icon or a real world object for a predefined amount of time, clicking on an activatable icon or real world object, double clicking on an activatable icon or real world object, clicking with a forefinger on an activatable icon or real world object, clicking with the middle finger on an activatable icon or real world object, clicking from the bottom on an activatable icon or real world object, clicking from the top on an activatable icon, grasping an activatable icon or real world object, gesturing towards an activatable icon or real world object from the right, gesturing towards an activatable icon or real world object from the left, passing through an activatable icon or real world object from the left, pushing the activatable icon or real world object, clapping or waving over an activatable icon or real world object, performing a blast gesture, performing a tapping gesture, performing a clockwise or counter clockwise gesture over an activatable icon or real world object, sliding an activatable icon or real world object, grasping an activatable icon or real world object with two fingers, or performing a click-drag-release motion.
  • The predefined object may be, for example, a user hand, a part of a user's hand, such as the user's finger 20 or parts of two different hands. Alternatively, the predefined object may be a stylus or wand.
  • When the processor 16 determines that a predefined gesture has been performed, this may be indicated to the user by any type of feedback, such as visual feedback, audio feedback, haptic feedback, directional vibration, air tactile feedback, or ultrasonic feedback. The feedback may be a visual indication in a form selected from an activatable icon displayed on a display device, a change in an activatable icon on a display device, a change in color of an activatable icon on a display device, a change in size of an activatable icon, animation of an activatable icon, an indication light, an indicator moving on a display device, a vibration, a directional vibration indication, an air tactile indication. The indication may be provided by an indicator moving on a display device that appears on top of all other images or video appearing on the display device. Visual feedback may be the appearance of a glow around the predefined object when a system recognizes the predefined object.
  • The gesture detection module 40 may use any method for detecting the predefined objects in images obtained by the image sensor 32. For example, the gesture detection module may detect the predefined object as disclosed in WO2005/091125 or WO 2010/086866.
  • The processor 16 is further configured to determine the real world object in the scene 8 towards which the predefined gesture was performed. Thus, for example, in the image shown in FIG. 4 b, the processor 16 would determine that the user's finger 20 is pointing at the billboard 14 by determining the fingertip location (X,Y) in the image and combining this information with the location of the user and the orientation of the goggles 4 from the state sensors 23. The real world object is thus identified by the processor without presenting to the user a cursor or other marker to indicate the real world object that the user wishes to select, enabling direct pointing at a real world object to start an interaction. The processor 16 searches in a memory, which may be integral with the processor 16 or may be remotely located, for data relating to the real-world object to which the user's finger 20 is pointing. For example, the memory may have stored data relating to the billboard 14. When the user points to an object in the scene 8 whose data is stored in the memory or is extracted from a remote server such as an Internet site, the data is displayed on the display device 10 superimposed on the user's view of the scene. Thus, when the user points to the billboard 14 (FIG. 3), visual data 21 relating to the billboard 14 is displayed on the display device 10, as shown in FIG. 4 c.
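  • By way of illustration only, the following Python sketch shows one way the fingertip pixel location could be combined with the user's position and compass heading to select the pointed-at object; the pinhole-style bearing model, the field of view, and the object coordinates are assumptions of the sketch and are not taken from the specification.

```python
# Sketch of resolving which real world object is being pointed at, by combining
# the fingertip pixel location with the device's position and compass heading.
# Assumes a flat local coordinate frame and a small database of known objects.
import math

def pixel_to_bearing(finger_x, image_width, heading_deg, horizontal_fov_deg=60.0):
    """Convert a horizontal pixel coordinate into a world compass bearing."""
    offset = (finger_x / image_width - 0.5) * horizontal_fov_deg
    return (heading_deg + offset) % 360.0

def select_pointed_object(finger_x, image_width, heading_deg, user_pos, objects):
    """objects: {name: (north_m, east_m)} relative positions in a flat approximation."""
    ray = pixel_to_bearing(finger_x, image_width, heading_deg)
    def angular_error(obj_pos):
        dn, de = obj_pos[0] - user_pos[0], obj_pos[1] - user_pos[1]
        bearing = math.degrees(math.atan2(de, dn)) % 360.0
        return abs((bearing - ray + 180.0) % 360.0 - 180.0)
    return min(objects, key=lambda name: angular_error(objects[name]))

objects = {"billboard": (50.0, 10.0), "building": (30.0, -40.0)}
print(select_pointed_object(finger_x=480, image_width=640,
                            heading_deg=20.0, user_pos=(0.0, 0.0), objects=objects))
```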
  • The visual data 21 may be static or animated. The visual data 21 may include one or more activatable icons, such that when a predefined gesture is performed relative to one of the activatable icons, a command associated with the activatable icon is executed. The command may be, for example, to display specific visual material relating to the selected real world object. The activatable icons may be 2D or 3D activatable icons and may be presented to the user so that the user perceives the icon in front of him in a 3D space. As used herein, an activatable icon is a region in a 2D or 3D image or video associated with one or more messages activated by user interaction. The activatable icons may be, for example, a 2D or 3D visual element. The activatable icons may be virtual buttons, a virtual keyboard, a 2D or 3D activatable icon, a region in an image or a video. An activatable icon may consist of two or more activatable icons.
  • The processor may not present part of the activatable icon where the predefined object is located, so that the predefined object appears to be on top of the activatable icon. The activatable icons may be removed when the user rapidly moves his head and then returned when the head motion is below a predefined motion speed.
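  • A possible sketch of this remove-and-return behaviour, using an illustrative hysteresis on the measured head motion, is given below; the thresholds are assumptions of the sketch.

```python
# Sketch of hiding activatable icons while the head moves quickly and restoring
# them once motion falls back below a threshold. The two thresholds form a
# simple hysteresis so the icons do not flicker near the boundary.
class IconVisibilityGate:
    def __init__(self, hide_above=2.0, show_below=0.5):
        self.hide_above = hide_above    # head rotation rate that hides the icons
        self.show_below = show_below    # rate below which icons are redisplayed
        self.visible = True

    def update(self, angular_speed):
        if self.visible and angular_speed > self.hide_above:
            self.visible = False
        elif not self.visible and angular_speed < self.show_below:
            self.visible = True
        return self.visible

gate = IconVisibilityGate()
for speed in (0.1, 3.0, 1.0, 0.2):     # still, fast head turn, slowing, still
    print(speed, gate.update(speed))
```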
  • The system 2 may have two or more operational modes and the processor 16 may be configured to identify one or more predefined gestures to change between the operational modes. Thus, a gesture may be used to turn the system on or off, select the source of the visual material to be presented, select the level of details of the visual material to be presented, select the buttons or activatable icons to be presented to the user, or activate an online service, such as an online service related to a selected real world object. Yet another mode of operation may be to start video recording of images by the image sensor and/or recording of sounds by a microphone upon identification of a predefined gesture and to stop recording upon identification of another predefined gesture. Yet another mode of operation is continuously monitoring video and/or sound, but following a detection of a predefined gesture, the video/sound is recorded starting from a predetermined amount of time prior to identification of the gesture, and stopping the recording after identification of another predefined gesture. The predetermined time may be defined by the user. Yet another mode of operation is adding tags in a captured and real-time recorded video upon identification of a predefined gesture.
  • Yet another mode of operation is shown in FIG. 6. In FIG. 6 a, an area 62 in the field of view 60 as captured by the image sensor is designated by the user performing a gesture of “drawing” the contour of the area, shown by phantom lines in FIG. 6. The selected area is then resized by the user performing a second gesture, such as separating two fingers or bringing two fingers closer together as indicated by the arrows 66 in FIG. 6 b, until the selected area attains the desired size (67 in FIG. 6 c). The area 67 is then dragged to a new location in the field of view (FIG. 6 d) and copied in the new location in the field of view. The system then employs a tracker on the selected area and the selected area is presented in real-time in the resized and relocated area set by the user on the display device.
  • In order to minimize CPU resources, for each displayed activatable icon a bounding box around the displayed activatable icon may be defined in the images that remains fixed. The system employs a machine vision tracker to track this bounding box. The bounding box is considered not to have moved between frames when the distance between the locations of the bounding boxes in two frames of a video sequence, as determined using the video tracker, is less than a predefined distance, and the correlation value of the tracker of the bounding box is below a predefined value.
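  • The following Python sketch illustrates one possible form of this bounding-box check; the thresholds and the direction of the correlation test are assumptions of the sketch rather than values from the specification.

```python
# Sketch of the bounding-box check around a displayed activatable icon: the box
# is treated as unchanged between frames while its displacement stays under a
# predefined distance and the tracker's correlation score stays acceptable.
# Threshold values and the sense of the correlation test are assumptions.
import math

def box_center(box):
    x, y, w, h = box
    return (x + w / 2.0, y + h / 2.0)

def box_is_stationary(box_prev, box_curr, correlation, max_shift=8.0, min_corr=0.7):
    (cx0, cy0), (cx1, cy1) = box_center(box_prev), box_center(box_curr)
    shift = math.hypot(cx1 - cx0, cy1 - cy0)
    return shift < max_shift and correlation >= min_corr

print(box_is_stationary((100, 100, 64, 64), (103, 101, 64, 64), correlation=0.92))
```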
  • When the system is in an operational mode in which only activatable icons may be activated, and real world objects cannot be activated, CPU usage can be minimized by searching for the predefined object only in the vicinity of each displayed activatable icon. In order to reduce CPU usage even further, the object recognition module is not activated all the time, but only when the headset is not moving significantly, as determined from information obtained by the state sensors.
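  • A minimal Python sketch of these two CPU-saving measures is given below; the motion threshold, the search margin, and the detector callback are illustrative assumptions.

```python
# Sketch of the two CPU-saving measures described: run object recognition only
# while the headset is roughly still, and only inside regions surrounding each
# displayed activatable icon.
import numpy as np

def detect_near_icons(frame, icons, detect_object, gyro_magnitude,
                      motion_threshold=0.3, margin=40):
    """frame: image array; icons: list of (x, y, w, h) boxes in display space."""
    if gyro_magnitude > motion_threshold:       # headset moving: skip recognition
        return []
    hits = []
    for (x, y, w, h) in icons:
        x0, y0 = max(0, x - margin), max(0, y - margin)
        region = frame[y0:y + h + margin, x0:x + w + margin]
        result = detect_object(region)          # e.g., a fingertip detector
        if result is not None:
            rx, ry = result
            hits.append((x0 + rx, y0 + ry))     # back to full-frame coordinates
    return hits

frame = np.zeros((480, 640), dtype=np.uint8)
icons = [(300, 200, 64, 64)]
print(detect_near_icons(frame, icons, lambda region: None, gyro_magnitude=0.1))
```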
  • A user may choose different filters to screen data correlated with real-world objects, such as a filter to “display data generated only by friends”, to display data from registered sources, or to display data generated in the last three months.
  • The system 2 may have a stand-by mode in which the power consumption by the system 2 is minimal. The active mode may be different from the stand-by mode, for example, in the number of video frames per second that are being analyzed by the system, the resolution of images that are being analyzed, the portion of the image frame that is being analyzed, and/or the detection modules that are activated. The system 2 can be brought to the active mode by any technique. For example, the system 2 may be brought to the active mode by bringing the predefined object into the field of view from below, by placing the predefined object in a certain location or pose, such as pointing at the bottom right corner of the camera field of view or opening a hand in the camera field of view, by performing a predefined gesture such as moving the hand from right to left across the field of view, by performing, when an activatable icon is displayed, a predefined gesture correlated with the activatable icon, such as pointing at the activatable icon, waving in the location where the activatable icon is presented, or sliding the floating activatable icon from one location to another by performing a gesture in the 3D space where the activatable icon is perceived to be located, by touching the device, or by tapping on the device if the device is provided with an accelerometer. As yet another example, the system may enter the active mode when the user passes a hand near the device, if the device is provided with a proximity sensor or an ultrasonic sensor. The system may also be activated by a voice command, or when the user places the predefined object in a particular location in the field of view. As yet another example, the system may enter the active mode only when there is relevant data associated with the real world in the field of view of the user. The system may indicate to the user when there is relevant data to be presented, or when it is ready for interaction.
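  • The following Python sketch illustrates, by way of example only, how the stand-by and active profiles and the wake-up events might be represented; the particular values and event names are assumptions of the sketch.

```python
# Sketch of the stand-by vs. active profiles: the modes differ in analyzed frame
# rate, image resolution, analyzed portion of the frame, and active detection
# modules. All values and event names below are illustrative.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class ModeProfile:
    frames_per_second: int
    resolution: Tuple[int, int]
    analyzed_region: Tuple[float, float, float, float]   # frame fraction (x, y, w, h)
    active_modules: List[str] = field(default_factory=list)

STAND_BY = ModeProfile(2, (160, 120), (0.3, 0.6, 0.4, 0.4), ["wake_gesture"])
ACTIVE   = ModeProfile(30, (640, 480), (0.0, 0.0, 1.0, 1.0),
                       ["gesture_detection", "object_recognition", "tracking"])

WAKE_EVENTS = {"object_from_below", "predefined_gesture", "point_at_icon",
               "device_tap", "proximity", "voice_command"}

def next_mode(current, event):
    """Switch from stand-by to active when any wake event is observed."""
    return ACTIVE if current is STAND_BY and event in WAKE_EVENTS else current

print(next_mode(STAND_BY, "voice_command") is ACTIVE)
```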
  • A visual indication may be attached to a real-world object to let the user know that there is data correlated with the real-world object.
  • An indication of relevant data may be overlaid on the location of the real-world object as a small visual indication: an activatable icon of “i” may indicate information, a logo of “photos” may indicate images related to the real-world object, and a logo of an “envelope” may indicate a message that was left by a friend or other user correlated with the real-world object. When the user performs a predefined gesture correlated to the activatable icon, the data may be presented.
  • The system 2 may be configured to undergo a calibration process to record various physical parameters of the predefined object so as to facilitate identification of the predefined object by the processor 16 in images obtained by the camera. This may be done, for example, by presenting to the user on the display activatable icons in different locations in the 3D space, extracting physical features of the predefined object such as its size or orientation, and determining a correlation between the dimensions of the predefined object and its distance from the camera. The calibration may involve calculating the triangle formed by the camera, the user's line of sight, and the tip of the predefined object in order to determine what the user is pointing at. The accuracy is improved by estimating the distance of the real world object from the camera based on information extracted in the calibration.
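  • By way of illustration, the sketch below fits the correlation between the apparent pixel size of the predefined object and its distance from the camera from a few calibration samples; the inverse-size model and the sample values are assumptions of the sketch, not parameters given in the specification.

```python
# Sketch of the calibration step relating the predefined object's apparent size
# in the image to its distance from the camera. Samples would be gathered while
# the user points at activatable icons shown at different 3D locations.
def fit_size_to_distance(samples):
    """samples: list of (pixel_size, known_distance_cm).
    Least-squares fit of k in the model: distance ~ k / pixel_size."""
    num = sum(d / s for s, d in samples)
    den = sum(1.0 / (s * s) for s, d in samples)
    return num / den

def estimate_distance(k, pixel_size):
    return k / pixel_size

k = fit_size_to_distance([(80, 30.0), (40, 60.0), (20, 120.0)])
print(round(estimate_distance(k, 60), 1))   # distance estimate for a 60-pixel fingertip
```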
  • The processor may be configured to identify, in images of the real-world scene obtained by the camera, another user of a system of the invention. The identification of another user in the real-world scene may be performed, for example, by informing a remote server of the locations of the devices in a particular geographical area. The locations of the other devices can then be sent to all of the devices in the geographical area (a proximity-filter sketch follows this list).
  • When a communication link exists between two systems of the invention, the two systems may be used for game playing. The other user may be represented as an avatar with whom the user can interact by gestures, for example by sending the other user a message such as “like”.
  • The processor may be configured to display a keyboard that enables text typing with one or more fingers or hands. Display of the keyboard may be initiated upon detection of a predefined gesture, such as a gesture from right to left, or upon the user presenting an open hand, or two open hands, in a predefined region of the camera's field of view, such as the bottom part of the field of view. Yet another way to initiate the display of the keyboard is for the user to perform a click gesture in the 3D space where the typing area or an activatable icon is perceived to be located. The keyboard may be used, for example, to write a note, conduct a search, or communicate with online services (such as Skype or Twitter) by typing on the virtual keyboard. The system may refrain from presenting the part of the keyboard where the predefined object is located, so that the predefined object, such as the user's hand, appears to be “over” the keyboard; a compositing sketch follows this list.
  • When the system is in a typing mode, an animated hand whose position is correlated with the user's hands and fingers may be presented on the keyboard. The fingertips of the animated hand may be located above the virtual keystroke at the location where the character of that keystroke is seen. The keyboard and the animated hand are preferably opaque, so that the user cannot see the background behind the keyboard. This tends to make the keyboard clearer to the user.
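
The following Python sketch is illustrative only and is not part of the claimed subject matter. It assumes that separate wake-gesture detection and full-frame analysis routines exist (detect_wake_gesture and analyze_frame below are hypothetical placeholders) and shows how a stand-by/active controller of the kind described above might analyze only a few frames per second until a wake condition is detected.

    import time

    class ActivationController:
        """Throttle analysis in stand-by mode; switch to active mode on a wake gesture."""

        def __init__(self, standby_fps=2, active_fps=30):
            self.standby_fps = standby_fps   # frames analyzed per second in stand-by
            self.active_fps = active_fps     # frames analyzed per second when active
            self.active = False

        def frame_interval(self):
            # Fewer frames are analyzed in stand-by to keep power consumption minimal.
            return 1.0 / (self.active_fps if self.active else self.standby_fps)

        def process(self, frame, detect_wake_gesture, analyze_frame):
            if not self.active:
                # In stand-by, only check for the wake condition (e.g. an open hand
                # entering the field of view, or a right-to-left swipe).
                if detect_wake_gesture(frame):
                    self.active = True
            else:
                analyze_frame(frame)  # full gesture detection and correlation
            time.sleep(self.frame_interval())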
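
A minimal sketch of the calibration-based distance estimate mentioned above, assuming a simple pinhole-camera relation in which the apparent width of the predefined object in pixels is inversely proportional to its distance from the camera; the numeric values are illustrative only.

    def calibrate(reference_pixel_width, reference_distance_m):
        # Record the apparent width (in pixels) of the predefined object while the
        # user points at an activatable icon presented at a known distance; under a
        # pinhole model the product pixel_width * distance is roughly constant.
        return reference_pixel_width * reference_distance_m

    def estimate_distance(current_pixel_width, k):
        # Apparent size is inversely proportional to distance: distance ~ k / width.
        return k / current_pixel_width

    k = calibrate(80.0, 0.40)                    # fingertip spanned 80 px at 0.40 m
    print(round(estimate_distance(50.0, k), 2))  # now spans 50 px -> roughly 0.64 m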
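
A sketch of one possible server-side proximity filter for identifying other devices in the same geographical area, as described above; the 200-metre radius and the device tuple format are assumptions made for illustration.

    import math

    def nearby_devices(devices, my_lat, my_lon, radius_m=200.0):
        # devices: list of (device_id, latitude, longitude) tuples reported to the
        # remote server; returns the ids of devices within radius_m metres,
        # computed with the haversine formula.
        nearby = []
        for device_id, lat, lon in devices:
            phi1, phi2 = math.radians(my_lat), math.radians(lat)
            dphi = math.radians(lat - my_lat)
            dlam = math.radians(lon - my_lon)
            a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
            distance_m = 2 * 6371000.0 * math.asin(math.sqrt(a))
            if distance_m <= radius_m:
                nearby.append(device_id)
        return nearby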
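
A sketch of the keyboard compositing step described above, assuming a hand-segmentation mask is available from the camera image: the keyboard is drawn opaque except where the mask indicates the predefined object, so the user's hand appears to lie over the keyboard.

    import numpy as np

    def composite_keyboard(camera_frame, keyboard_image, hand_mask):
        # camera_frame, keyboard_image: H x W x 3 uint8 images of the same size.
        # hand_mask: H x W boolean array, True where the hand was segmented.
        out = keyboard_image.copy()               # keyboard is opaque by default
        out[hand_mask] = camera_frame[hand_mask]  # keep camera pixels where the hand is
        return out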

Claims (31)

1-56. (canceled)
57. An augmented reality device, comprising:
at least one processor configured to:
receive, from an image sensor, image information associated with a scene;
output, to a display, visual data to enable a user to view both the visual data and at least a portion of the scene;
detect, in the image information, a predefined gesture performed by the user;
correlate the predefined gesture with the visual data; and
alter the displayed visual data based on the correlation.
58. The augmented reality device of claim 57, wherein the visual data includes at least one of information associated with one or more objects in the scene, images associated with the scene, and one or more distances associated with the scene.
59. The augmented reality device of claim 57, wherein, to correlate the predefined gesture with the visual data, the at least one processor is configured to:
determine a reference location in three dimensional space of at least a portion of a hand of the user; and
determine, in at least one of the visual data and the image information, data associated with the reference location.
60. The augmented reality device of claim 59, wherein, to alter the displayed visual data, the at least one processor is configured to change the visual data as a function of the data associated with the reference location.
61. The augmented reality device of claim 57, wherein the predefined gesture is a hand gesture.
62. The augmented reality device of claim 61, wherein the hand gesture includes a pointing finger.
63. The augmented reality device of claim 57, wherein the at least one processor is further configured to output at least one of a message and a command based on the correlation.
64. The augmented reality device of claim 57, wherein the predefined gesture is performed by a predefined object, and further wherein the at least one processor is further configured to:
calibrate one or more parameters associated with the predefined object; and
detect the predefined gesture using the one or more parameters.
65. The augmented reality device of claim 57, wherein the at least one processor is further configured to enable a user to choose one or more filters that screen what data is selected as the visual data.
66. The augmented reality device of claim 57, wherein the at least one processor is further configured to:
detect at least one object pointed to by the user; and
determine the visual data based on the detected object.
67. The augmented reality device of claim 57, wherein the display is a head-mounted display.
68. The augmented reality device of claim 67, wherein the at least one processor is further configured to:
determine an orientation of the head-mounted display;
detect, using the determined orientation, at least one object pointed to by the user; and
determine the visual data based on the detected object.
69. The augmented reality device of claim 67, wherein the at least one processor is further configured to:
determine an orientation of the head-mounted display;
determine at least one fingertip location in the image information;
detect, using the determined orientation and the at least one fingertip location, at least one object pointed to by the user; and
determine the visual data based on the detected object.
70. The augmented reality device of claim 57, wherein the predefined gesture is performed by a predefined object, and further wherein, to output the visual data to the display, the at least one processor is further configured to prevent data from being displayed at one or more locations associated with the predefined object.
71. The augmented reality device of claim 57, wherein the at least one processor is further configured to:
stop display of the visual data based on a head motion of the user; and
resume display of the visual data when the head motion is below a predefined motion speed.
72. The augmented reality device of claim 57, wherein the visual data provides an indication that information related to an object in the scene is available.
73. The augmented reality device of claim 57, wherein, to output the visual data to the display, the at least one processor is further configured to superimpose the visual data on at least a portion of the user's view of the scene.
74. A non-transitory computer-readable medium comprising instructions that, when executed by at least one processor, cause the at least one processor to perform operations including:
receiving, from an image sensor, image information associated with a scene;
outputting, to a display, visual data to enable a user to view both the visual data and at least a portion of the scene;
detecting, in the image information, a predefined gesture performed by the user;
correlating the predefined gesture with the visual data; and
altering the displayed visual data based on the correlation.
75. The non-transitory computer-readable medium of claim 74, wherein the visual data includes at least one of information associated with one or more objects in the scene, images associated with the scene, and one or more distances associated with the scene.
76. The non-transitory computer-readable medium of claim 74, wherein the instructions, when executed by the at least one processor, further cause the at least one processor to perform operations including:
determining a reference location in three dimensional space of at least a portion of a hand of the user; and
determining, in at least one of the visual data and the image information, data associated with the reference location.
77. The non-transitory computer-readable medium of claim 74, wherein the predefined gesture is performed by a predefined object, and further wherein the instructions, when executed by the at least one processor, further cause the at least one processor to perform operations including:
calibrating one or more parameters associated with the predefined object; and
detecting the predefined gesture using the one or more parameters.
78. The non-transitory computer-readable medium of claim 74, wherein the instructions, when executed by the at least one processor, further cause the at least one processor to perform an operation including enabling a user to choose one or more filters that screen what data is selected as the visual data.
79. The non-transitory computer-readable medium of claim 74, wherein the instructions, when executed by the at least one processor, further cause the at least one processor to perform operations including:
detecting at least one object pointed to by the user; and
determining the visual data based on the detected object.
80. The non-transitory computer-readable medium of claim 74, wherein the display is a head-mounted display.
81. The non-transitory computer-readable medium of claim 80, wherein the instructions, when executed by the at least one processor, further cause the at least one processor to perform operations including:
determining an orientation of the head-mounted display;
detecting, using the determined orientation, at least one object pointed to by the user; and
determining the visual data based on the detected object.
82. The non-transitory computer-readable medium of claim 80, wherein the instructions, when executed by the at least one processor, further cause the at least one processor to perform operations including:
determining an orientation of the head-mounted display;
determining at least one fingertip location in the image information;
detecting, using the determined orientation and the at least one fingertip location, at least one object pointed to by the user; and
determining the visual data based on the detected object.
83. The non-transitory computer-readable medium of claim 74, wherein the predefined gesture is performed by a predefined object, and further wherein the instructions, when executed by the at least one processor, further cause the at least one processor to perform an operation including preventing data from being displayed at one or more locations associated with the predefined object.
84. The non-transitory computer-readable medium of claim 74, wherein the instructions, when executed by the at least one processor, further cause the at least one processor to perform operations including:
stopping display of the visual data based on a head motion of the user; and
resuming display of the visual data when the head motion is below a predefined motion speed.
85. The non-transitory computer-readable medium of claim 74, wherein the instructions, when executed by the at least one processor, further cause the at least one processor to perform an operation including superimposing the visual data on at least a portion of the user's view of the scene.
86. An augmented reality method, comprising:
receiving, from an image sensor, image information associated with a scene;
outputting, to a display, visual data to enable a user to view both the visual data and at least a portion of the scene;
detecting, in the image information, a predefined gesture performed by the user;
correlating the predefined gesture with the visual data; and
altering the displayed visual data based on the correlation.
US14/345,592 2011-09-19 2012-09-19 Touch Free Interface for Augmented Reality Systems Abandoned US20140361988A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/345,592 US20140361988A1 (en) 2011-09-19 2012-09-19 Touch Free Interface for Augmented Reality Systems

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201161536144P 2011-09-19 2011-09-19
US14/345,592 US20140361988A1 (en) 2011-09-19 2012-09-19 Touch Free Interface for Augmented Reality Systems
PCT/IL2012/050376 WO2013093906A1 (en) 2011-09-19 2012-09-19 Touch free interface for augmented reality systems

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2012/050376 A-371-Of-International WO2013093906A1 (en) 2011-09-19 2012-09-19 Touch free interface for augmented reality systems

Related Child Applications (5)

Application Number Title Priority Date Filing Date
US15/060,533 Continuation US20160259423A1 (en) 2011-09-19 2016-03-03 Touch fee interface for augmented reality systems
US15/090,527 Continuation US20160291699A1 (en) 2011-09-19 2016-04-04 Touch fee interface for augmented reality systems
US15/096,674 Continuation US20160306433A1 (en) 2011-09-19 2016-04-12 Touch fee interface for augmented reality systems
US15/144,209 Continuation US10401967B2 (en) 2011-09-19 2016-05-02 Touch free interface for augmented reality systems
US15/256,481 Continuation US20170052599A1 (en) 2011-09-19 2016-09-02 Touch Free Interface For Augmented Reality Systems

Publications (1)

Publication Number Publication Date
US20140361988A1 true US20140361988A1 (en) 2014-12-11

Family

ID=47189999

Family Applications (8)

Application Number Title Priority Date Filing Date
US14/345,592 Abandoned US20140361988A1 (en) 2011-09-19 2012-09-19 Touch Free Interface for Augmented Reality Systems
US15/060,533 Abandoned US20160259423A1 (en) 2011-09-19 2016-03-03 Touch fee interface for augmented reality systems
US15/090,527 Abandoned US20160291699A1 (en) 2011-09-19 2016-04-04 Touch fee interface for augmented reality systems
US15/096,674 Abandoned US20160306433A1 (en) 2011-09-19 2016-04-12 Touch fee interface for augmented reality systems
US15/144,209 Expired - Fee Related US10401967B2 (en) 2011-09-19 2016-05-02 Touch free interface for augmented reality systems
US15/256,481 Abandoned US20170052599A1 (en) 2011-09-19 2016-09-02 Touch Free Interface For Augmented Reality Systems
US16/557,183 Active US11093045B2 (en) 2011-09-19 2019-08-30 Systems and methods to augment user interaction with the environment outside of a vehicle
US17/401,427 Active US11494000B2 (en) 2011-09-19 2021-08-13 Touch free interface for augmented reality systems

Family Applications After (7)

Application Number Title Priority Date Filing Date
US15/060,533 Abandoned US20160259423A1 (en) 2011-09-19 2016-03-03 Touch fee interface for augmented reality systems
US15/090,527 Abandoned US20160291699A1 (en) 2011-09-19 2016-04-04 Touch fee interface for augmented reality systems
US15/096,674 Abandoned US20160306433A1 (en) 2011-09-19 2016-04-12 Touch fee interface for augmented reality systems
US15/144,209 Expired - Fee Related US10401967B2 (en) 2011-09-19 2016-05-02 Touch free interface for augmented reality systems
US15/256,481 Abandoned US20170052599A1 (en) 2011-09-19 2016-09-02 Touch Free Interface For Augmented Reality Systems
US16/557,183 Active US11093045B2 (en) 2011-09-19 2019-08-30 Systems and methods to augment user interaction with the environment outside of a vehicle
US17/401,427 Active US11494000B2 (en) 2011-09-19 2021-08-13 Touch free interface for augmented reality systems

Country Status (5)

Country Link
US (8) US20140361988A1 (en)
JP (3) JP2014531662A (en)
KR (3) KR20220032059A (en)
CN (2) CN115167675A (en)
WO (1) WO2013093906A1 (en)

Cited By (123)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140028716A1 (en) * 2012-07-30 2014-01-30 Mitac International Corp. Method and electronic device for generating an instruction in an augmented reality environment
US20140125580A1 (en) * 2012-11-02 2014-05-08 Samsung Electronics Co., Ltd. Method and device for providing information regarding an object
US20140225918A1 (en) * 2013-02-14 2014-08-14 Qualcomm Incorporated Human-body-gesture-based region and volume selection for hmd
US20140240226A1 (en) * 2013-02-27 2014-08-28 Robert Bosch Gmbh User Interface Apparatus
US20140270352A1 (en) * 2013-03-14 2014-09-18 Honda Motor Co., Ltd. Three dimensional fingertip tracking
US20140285520A1 (en) * 2013-03-22 2014-09-25 Industry-University Cooperation Foundation Hanyang University Wearable display device using augmented reality
US20140306881A1 (en) * 2013-04-15 2014-10-16 Olympus Corporation Wearable device, program and display controlling method of wearable device
US20140358002A1 (en) * 2011-12-23 2014-12-04 Koninklijke Philips N.V. Method and apparatus for interactive display of three dimensional ultrasound images
US20150146007A1 (en) * 2013-11-26 2015-05-28 Honeywell International Inc. Maintenance assistant system
US20150243105A1 (en) * 2013-07-12 2015-08-27 Magic Leap, Inc. Method and system for interacting with user interfaces
US20150271396A1 (en) * 2014-03-24 2015-09-24 Samsung Electronics Co., Ltd. Electronic device and method for image data processing
US20150356789A1 (en) * 2013-02-21 2015-12-10 Fujitsu Limited Display device and display method
US20150358614A1 (en) * 2014-06-05 2015-12-10 Samsung Electronics Co., Ltd. Wearable device and method for providing augmented reality information
US9213403B1 (en) 2013-03-27 2015-12-15 Google Inc. Methods to pan, zoom, crop, and proportionally move on a head mountable display
US20150364037A1 (en) * 2014-06-12 2015-12-17 Lg Electronics Inc. Mobile terminal and control system
US20160048203A1 (en) * 2014-08-18 2016-02-18 Universal City Studios Llc Systems and methods for generating augmented and virtual reality images
US20160048024A1 (en) * 2014-08-13 2016-02-18 Beijing Lenovo Software Ltd. Information processing method and electronic device
US20160054802A1 (en) * 2014-08-21 2016-02-25 Samsung Electronics Co., Ltd. Sensor based ui in hmd incorporating light turning element
US20160103437A1 (en) * 2013-06-27 2016-04-14 Abb Technology Ltd Method and data presenting device for assisting a remote user to provide instructions
CN105843390A (en) * 2016-02-24 2016-08-10 上海理湃光晶技术有限公司 Method for image scaling and AR (Augmented Reality) glasses based on method
JP2016161734A (en) * 2015-03-02 2016-09-05 セイコーエプソン株式会社 Display device, method for controlling display device, and program
US20160313956A1 (en) * 2015-04-24 2016-10-27 Panasonic Intellectual Property Corporation Of America Head-mounted display apparatus worn on user's head
US9507426B2 (en) * 2013-03-27 2016-11-29 Google Inc. Using the Z-axis in user interfaces for head mountable displays
US9526443B1 (en) * 2013-01-19 2016-12-27 Bertec Corporation Force and/or motion measurement system and a method of testing a subject
WO2016206874A1 (en) * 2015-06-23 2016-12-29 Siemens Aktiengesellschaft Interaction system
US20160378294A1 (en) * 2015-06-24 2016-12-29 Shawn Crispin Wright Contextual cursor display based on hand tracking
US20170061700A1 (en) * 2015-02-13 2017-03-02 Julian Michael Urbach Intercommunication between a head mounted display and a real world object
US9685005B2 (en) * 2015-01-02 2017-06-20 Eon Reality, Inc. Virtual lasers for interacting with augmented reality environments
WO2017112099A1 (en) * 2015-12-23 2017-06-29 Intel Corporation Text functions in augmented reality
US9770203B1 (en) 2013-01-19 2017-09-26 Bertec Corporation Force measurement system and a method of testing a subject
WO2017172211A1 (en) * 2016-03-31 2017-10-05 Intel Corporation Augmented reality in a field of view including a reflection
ES2643863A1 (en) * 2016-05-24 2017-11-24 Sonovisión Ingenieros España, S.A.U. Method to provide, through guided augmented reality, inspection and support in installation or maintenance of processes for complex assemblies compatible with S1000D, and device that makes use of the same (Machine-translation by Google Translate, not legally binding)
WO2018071019A1 (en) * 2016-10-13 2018-04-19 Ford Motor Company Dual-mode augmented reality interfaces for mobile devices
US9990779B2 (en) * 2016-05-13 2018-06-05 Meta Company System and method for modifying virtual objects in a virtual environment in response to user interactions
US10010286B1 (en) 2013-01-19 2018-07-03 Bertec Corporation Force measurement system
US20180189474A1 (en) * 2016-12-30 2018-07-05 Baidu Online Network Technology (Beijing) Co., Ltd. Method and Electronic Device for Unlocking Electronic Device
CN108369640A (en) * 2015-12-17 2018-08-03 诺基亚技术有限公司 For control scene capture images image procossing to adjust the method, apparatus or computer program of capture images
US20180225520A1 (en) * 2015-02-23 2018-08-09 Vivint, Inc. Techniques for identifying and indexing distinguishing features in a video feed
US10055888B2 (en) 2015-04-28 2018-08-21 Microsoft Technology Licensing, Llc Producing and consuming metadata within multi-dimensional data
US10068403B1 (en) 2017-09-21 2018-09-04 Universal City Studios Llc Locker management techniques
US10133345B2 (en) 2016-03-22 2018-11-20 Microsoft Technology Licensing, Llc Virtual-reality navigation
US10156726B2 (en) * 2015-06-29 2018-12-18 Microsoft Technology Licensing, Llc Graphene in optical systems
US10168768B1 (en) 2016-03-02 2019-01-01 Meta Company Systems and methods to facilitate interactions in an interactive space
US10186088B2 (en) 2016-05-13 2019-01-22 Meta Company System and method for managing interactive virtual frames for virtual objects in a virtual environment
US10231662B1 (en) 2013-01-19 2019-03-19 Bertec Corporation Force measurement system
US20190129179A1 (en) * 2017-11-02 2019-05-02 Olympus Corporation Work assistance apparatus, work assistance method, and computer-readable, non-transitory recording medium recording work assistance program executed by computer
US10310624B2 (en) * 2014-12-17 2019-06-04 Konica Minolta, Inc. Electronic apparatus, method for controlling electronic apparatus, and control program for the same
US10405024B2 (en) * 2016-10-26 2019-09-03 Orcam Technologies Ltd. Wearable device and methods for transmitting information based on physical distance
US10413230B1 (en) 2013-01-19 2019-09-17 Bertec Corporation Force measurement system
US20190339837A1 (en) * 2018-05-04 2019-11-07 Oculus Vr, Llc Copy and Paste in a Virtual Reality Environment
US10481755B1 (en) * 2017-04-28 2019-11-19 Meta View, Inc. Systems and methods to present virtual content in an interactive space
US10602200B2 (en) 2014-05-28 2020-03-24 Lucasfilm Entertainment Company Ltd. Switching modes of a media content item
US10606359B2 (en) 2014-12-19 2020-03-31 Immersion Corporation Systems and methods for haptically-enabled interactions with objects
US10620779B2 (en) * 2017-04-24 2020-04-14 Microsoft Technology Licensing, Llc Navigating a holographic image
US10646153B1 (en) 2013-01-19 2020-05-12 Bertec Corporation Force measurement system
US10656711B2 (en) 2016-07-25 2020-05-19 Facebook Technologies, Llc Methods and apparatus for inferring user intent based on neuromuscular signals
US10684367B2 (en) 2014-11-26 2020-06-16 Samsung Electronics Co., Ltd. Ultrasound sensor and object detecting method thereof
US10687759B2 (en) 2018-05-29 2020-06-23 Facebook Technologies, Llc Shielding techniques for noise reduction in surface electromyography signal measurement and related systems and methods
US10768426B2 (en) 2018-05-21 2020-09-08 Microsoft Technology Licensing, Llc Head mounted display system receiving three-dimensional push notification
US10789952B2 (en) * 2018-12-20 2020-09-29 Microsoft Technology Licensing, Llc Voice command execution from auxiliary input
US10842407B2 (en) 2018-08-31 2020-11-24 Facebook Technologies, Llc Camera-guided interpretation of neuromuscular signals
US10856796B1 (en) 2013-01-19 2020-12-08 Bertec Corporation Force measurement system
US10909767B1 (en) * 2019-08-01 2021-02-02 International Business Machines Corporation Focal and interaction driven content replacement into augmented reality
US10909762B2 (en) 2018-08-24 2021-02-02 Microsoft Technology Licensing, Llc Gestures for facilitating interaction with pages in a mixed reality environment
US10905383B2 (en) 2019-02-28 2021-02-02 Facebook Technologies, Llc Methods and apparatus for unsupervised one-shot machine learning for classification of human gestures and estimation of applied forces
US10921764B2 (en) 2018-09-26 2021-02-16 Facebook Technologies, Llc Neuromuscular control of physical objects in an environment
US10929099B2 (en) * 2018-11-02 2021-02-23 Bose Corporation Spatialized virtual personal assistant
US10937414B2 (en) 2018-05-08 2021-03-02 Facebook Technologies, Llc Systems and methods for text input using neuromuscular information
US10970936B2 (en) 2018-10-05 2021-04-06 Facebook Technologies, Llc Use of neuromuscular signals to provide enhanced interactions with physical objects in an augmented reality environment
US10970374B2 (en) 2018-06-14 2021-04-06 Facebook Technologies, Llc User identification and authentication with neuromuscular signatures
US10990174B2 (en) 2016-07-25 2021-04-27 Facebook Technologies, Llc Methods and apparatus for predicting musculo-skeletal position information using wearable autonomous sensors
CN112789577A (en) * 2018-09-20 2021-05-11 脸谱科技有限责任公司 Neuromuscular text input, writing and drawing in augmented reality systems
US20210158630A1 (en) * 2019-11-25 2021-05-27 Facebook Technologies, Llc Systems and methods for contextualized interactions with an environment
US11036302B1 (en) 2018-05-08 2021-06-15 Facebook Technologies, Llc Wearable devices and methods for improved speech recognition
US11045137B2 (en) 2018-07-19 2021-06-29 Facebook Technologies, Llc Methods and apparatus for improved signal robustness for a wearable neuromuscular recording device
US11052288B1 (en) 2013-01-19 2021-07-06 Bertec Corporation Force measurement system
US11069148B2 (en) 2018-01-25 2021-07-20 Facebook Technologies, Llc Visualization of reconstructed handstate information
US11079846B2 (en) 2013-11-12 2021-08-03 Facebook Technologies, Llc Systems, articles, and methods for capacitive electromyography sensors
US11099716B2 (en) 2016-12-23 2021-08-24 Realwear, Inc. Context based content navigation for wearable display
EP3847531A4 (en) * 2019-01-31 2021-11-03 Huawei Technologies Co., Ltd. Hand-over-face input sensing for interaction with a device having a built-in camera
US11179066B2 (en) 2018-08-13 2021-11-23 Facebook Technologies, Llc Real-time spike detection and identification
US11216069B2 (en) 2018-05-08 2022-01-04 Facebook Technologies, Llc Systems and methods for improved speech recognition using neuromuscular information
CN114089879A (en) * 2021-11-15 2022-02-25 北京灵犀微光科技有限公司 Cursor control method of augmented reality display equipment
US20220103748A1 (en) * 2020-09-28 2022-03-31 Ilteris Canberk Touchless photo capture in response to detected hand gestures
US20220113814A1 (en) 2019-09-30 2022-04-14 Yu Jiang Tham Smart ring for manipulating virtual objects displayed by a wearable device
US11311209B1 (en) 2013-01-19 2022-04-26 Bertec Corporation Force measurement system and a motion base used therein
US11321880B2 (en) 2017-12-22 2022-05-03 Sony Corporation Information processor, information processing method, and program for specifying an important region of an operation target in a moving image
US11337652B2 (en) 2016-07-25 2022-05-24 Facebook Technologies, Llc System and method for measuring the movements of articulated rigid bodies
US11354896B2 (en) 2019-07-31 2022-06-07 Seiko Epson Corporation Display device, display method, and computer program
US11409497B2 (en) 2016-12-23 2022-08-09 Realwear, Inc. Hands-free navigation of touch-based operating systems
US11435857B1 (en) * 2021-04-29 2022-09-06 Google Llc Content access and navigation using a head-mounted device
US11481031B1 (en) 2019-04-30 2022-10-25 Meta Platforms Technologies, Llc Devices, systems, and methods for controlling computing devices via neuromuscular signals of users
US11481030B2 (en) 2019-03-29 2022-10-25 Meta Platforms Technologies, Llc Methods and apparatus for gesture detection and classification
US20220350997A1 (en) * 2021-04-29 2022-11-03 Google Llc Pointer-based content recognition using a head-mounted device
US11493993B2 (en) 2019-09-04 2022-11-08 Meta Platforms Technologies, Llc Systems, methods, and interfaces for performing inputs based on neuromuscular control
US11507216B2 (en) * 2016-12-23 2022-11-22 Realwear, Inc. Customizing user interfaces of binary applications
US11520399B2 (en) 2020-05-26 2022-12-06 Snap Inc. Interactive augmented reality experiences using positional tracking
US11531402B1 (en) 2021-02-25 2022-12-20 Snap Inc. Bimanual gestures for controlling virtual and graphical elements
US11540744B1 (en) 2013-01-19 2023-01-03 Bertec Corporation Force measurement system
US11635736B2 (en) 2017-10-19 2023-04-25 Meta Platforms Technologies, Llc Systems and methods for identifying biological structures associated with neuromuscular source signals
US20230131474A1 (en) * 2019-10-16 2023-04-27 Samuel R. Pecota Augmented reality marine navigation
US11644902B2 (en) * 2020-11-30 2023-05-09 Google Llc Gesture-based content transfer
US11644799B2 (en) 2013-10-04 2023-05-09 Meta Platforms Technologies, Llc Systems, articles and methods for wearable electronic devices employing contact sensors
US11656686B2 (en) * 2014-09-09 2023-05-23 Ultrahaptics Ip Ltd Method and apparatus for modulating haptic feedback
US11666264B1 (en) 2013-11-27 2023-06-06 Meta Platforms Technologies, Llc Systems, articles, and methods for electromyography sensors
US11740313B2 (en) 2020-12-30 2023-08-29 Snap Inc. Augmented reality precision tracking and display
US20230333378A1 (en) * 2017-08-25 2023-10-19 Snap Inc. Wristwatch based interface for augmented reality eyewear
US11797087B2 (en) 2018-11-27 2023-10-24 Meta Platforms Technologies, Llc Methods and apparatus for autocalibration of a wearable electrode sensor system
US11798429B1 (en) 2020-05-04 2023-10-24 Snap Inc. Virtual tutorials for musical instruments with finger tracking in augmented reality
US11804041B2 (en) 2018-12-04 2023-10-31 Apple Inc. Method, device, and system for generating affordances linked to a representation of an item
US11816267B2 (en) 2020-06-23 2023-11-14 Ultraleap Limited Features of airborne ultrasonic fields
US11830351B2 (en) 2015-02-20 2023-11-28 Ultrahaptics Ip Ltd Algorithm improvements in a haptic system
US11842517B2 (en) 2019-04-12 2023-12-12 Ultrahaptics Ip Ltd Using iterative 3D-model fitting for domain adaptation of a hand-pose-estimation neural network
USD1009884S1 (en) * 2019-07-26 2024-01-02 Sony Corporation Mixed reality eyeglasses or portion thereof with an animated graphical user interface
US11861070B2 (en) 2021-04-19 2024-01-02 Snap Inc. Hand gestures for animating and controlling virtual and graphical elements
US11857331B1 (en) 2013-01-19 2024-01-02 Bertec Corporation Force measurement system
US11868531B1 (en) 2021-04-08 2024-01-09 Meta Platforms Technologies, Llc Wearable device providing for thumb-to-finger-based input gestures detected based on neuromuscular signals, and systems and methods of use thereof
US11883847B2 (en) 2018-05-02 2024-01-30 Ultraleap Limited Blocking plate structure for improved acoustic transmission efficiency
US11886639B2 (en) 2020-09-17 2024-01-30 Ultraleap Limited Ultrahapticons
US11921928B2 (en) 2017-11-26 2024-03-05 Ultrahaptics Ip Ltd Haptic effects from focused acoustic fields
US11921471B2 (en) 2013-08-16 2024-03-05 Meta Platforms Technologies, Llc Systems, articles, and methods for wearable devices having secondary power sources in links of a band for providing secondary power in addition to a primary power source
US11925863B2 (en) 2020-09-18 2024-03-12 Snap Inc. Tracking hand gestures for interactive game control in augmented reality
US11961494B1 (en) 2020-03-27 2024-04-16 Meta Platforms Technologies, Llc Electromagnetic interference reduction in extended reality environments

Families Citing this family (114)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9865125B2 (en) 2010-11-15 2018-01-09 Bally Gaming, Inc. System and method for augmented reality gaming
KR20220032059A (en) 2011-09-19 2022-03-15 아이사이트 모빌 테크놀로지 엘티디 Touch free interface for augmented reality systems
US11068049B2 (en) 2012-03-23 2021-07-20 Microsoft Technology Licensing, Llc Light guide display and field of view
US10191515B2 (en) 2012-03-28 2019-01-29 Microsoft Technology Licensing, Llc Mobile device light guide display
US9558590B2 (en) 2012-03-28 2017-01-31 Microsoft Technology Licensing, Llc Augmented reality light guide display
US9717981B2 (en) 2012-04-05 2017-08-01 Microsoft Technology Licensing, Llc Augmented reality and physical games
US10502876B2 (en) 2012-05-22 2019-12-10 Microsoft Technology Licensing, Llc Waveguide optics focus elements
US10192358B2 (en) 2012-12-20 2019-01-29 Microsoft Technology Licensing, Llc Auto-stereoscopic augmented reality display
US9401850B2 (en) 2013-05-08 2016-07-26 Vringo Infrastructure Inc. Cognitive radio system and cognitive radio carrier device
US9672627B1 (en) * 2013-05-09 2017-06-06 Amazon Technologies, Inc. Multiple camera based motion tracking
KR102157313B1 (en) * 2013-09-03 2020-10-23 삼성전자주식회사 Method and computer readable recording medium for recognizing an object using a captured image
KR102165818B1 (en) * 2013-09-10 2020-10-14 삼성전자주식회사 Method, apparatus and recovering medium for controlling user interface using a input image
JP5877824B2 (en) * 2013-09-20 2016-03-08 ヤフー株式会社 Information processing system, information processing method, and information processing program
KR102119659B1 (en) 2013-09-23 2020-06-08 엘지전자 주식회사 Display device and control method thereof
CN103501473B (en) * 2013-09-30 2016-03-09 陈创举 Based on multifunctional headphone and the control method thereof of MEMS sensor
KR101499044B1 (en) * 2013-10-07 2015-03-11 홍익대학교 산학협력단 Wearable computer obtaining text based on gesture and voice of user and method of obtaining the text
US9671826B2 (en) * 2013-11-27 2017-06-06 Immersion Corporation Method and apparatus of body-mediated digital content transfer and haptic feedback
US9264479B2 (en) * 2013-12-30 2016-02-16 Daqri, Llc Offloading augmented reality processing
US10586395B2 (en) 2013-12-30 2020-03-10 Daqri, Llc Remote object detection and local tracking using visual odometry
EP2899609B1 (en) * 2014-01-24 2019-04-17 Sony Corporation System and method for name recollection
DE102014201578A1 (en) * 2014-01-29 2015-07-30 Volkswagen Ag Device and method for signaling an input area for gesture recognition of a human-machine interface
US20150227231A1 (en) * 2014-02-12 2015-08-13 Microsoft Corporation Virtual Transparent Display
WO2015161062A1 (en) * 2014-04-18 2015-10-22 Bally Gaming, Inc. System and method for augmented reality gaming
US9501871B2 (en) 2014-04-30 2016-11-22 At&T Mobility Ii Llc Explorable augmented reality displays
TWI518603B (en) 2014-05-22 2016-01-21 宏達國際電子股份有限公司 Image capturing method and electric device using the same
EP3180676A4 (en) * 2014-06-17 2018-01-10 Osterhout Group, Inc. External user interface for head worn computing
JP6500355B2 (en) * 2014-06-20 2019-04-17 富士通株式会社 Display device, display program, and display method
US20150379770A1 (en) * 2014-06-27 2015-12-31 David C. Haley, JR. Digital action in response to object interaction
US9959591B2 (en) * 2014-07-31 2018-05-01 Seiko Epson Corporation Display apparatus, method for controlling display apparatus, and program
CN104133593A (en) * 2014-08-06 2014-11-05 北京行云时空科技有限公司 Character input system and method based on motion sensing
CN104156082A (en) * 2014-08-06 2014-11-19 北京行云时空科技有限公司 Control system and intelligent terminal of user interfaces and applications aimed at space-time scenes
CN104197950B (en) * 2014-08-19 2018-02-16 奇瑞汽车股份有限公司 The method and system that geography information is shown
JP5989725B2 (en) * 2014-08-29 2016-09-07 京セラドキュメントソリューションズ株式会社 Electronic device and information display program
DE102014217843A1 (en) * 2014-09-05 2016-03-10 Martin Cudzilo Apparatus for facilitating the cleaning of surfaces and methods for detecting cleaning work done
US20160109701A1 (en) * 2014-10-15 2016-04-21 GM Global Technology Operations LLC Systems and methods for adjusting features within a head-up display
TWI613615B (en) * 2014-10-15 2018-02-01 在地實驗文化事業有限公司 Guiding system and method
US10108256B2 (en) 2014-10-30 2018-10-23 Mediatek Inc. Systems and methods for processing incoming events while performing a virtual reality session
CN107077214A (en) * 2014-11-06 2017-08-18 皇家飞利浦有限公司 For the method and system of the communication used within the hospital
CN104537401B (en) * 2014-12-19 2017-05-17 南京大学 Reality augmentation system and working method based on technologies of radio frequency identification and depth of field sensor
US9600076B2 (en) * 2014-12-19 2017-03-21 Immersion Corporation Systems and methods for object manipulation with haptic feedback
US20160196693A1 (en) * 2015-01-06 2016-07-07 Seiko Epson Corporation Display system, control method for display device, and computer program
TWI619041B (en) * 2015-01-09 2018-03-21 Chunghwa Telecom Co Ltd Augmented reality unlocking system and method
CN104570354A (en) * 2015-01-09 2015-04-29 京东方科技集团股份有限公司 Interactive glasses and visitor guide system
US10317215B2 (en) 2015-01-09 2019-06-11 Boe Technology Group Co., Ltd. Interactive glasses and navigation system
JP2016133541A (en) * 2015-01-16 2016-07-25 株式会社ブリリアントサービス Electronic spectacle and method for controlling the same
US10018844B2 (en) 2015-02-09 2018-07-10 Microsoft Technology Licensing, Llc Wearable image display system
US10317677B2 (en) 2015-02-09 2019-06-11 Microsoft Technology Licensing, Llc Display system
CN107407977B (en) * 2015-03-05 2020-12-08 索尼公司 Information processing apparatus, control method, and program
JP6596883B2 (en) 2015-03-31 2019-10-30 ソニー株式会社 Head mounted display, head mounted display control method, and computer program
US20160292920A1 (en) * 2015-04-01 2016-10-06 Caterpillar Inc. Time-Shift Controlled Visualization of Worksite Operations
US10156908B2 (en) * 2015-04-15 2018-12-18 Sony Interactive Entertainment Inc. Pinch and hold gesture navigation on a head-mounted display
CN105138763A (en) * 2015-08-19 2015-12-09 中山大学 Method for real scene and reality information superposition in augmented reality
CN112557677A (en) * 2015-08-25 2021-03-26 株式会社日立高新技术 Automatic analyzer and sample testing automation system
CN105205454A (en) * 2015-08-27 2015-12-30 深圳市国华识别科技开发有限公司 System and method for capturing target object automatically
KR102456597B1 (en) * 2015-09-01 2022-10-20 삼성전자주식회사 Electronic apparatus and operating method thereof
KR101708455B1 (en) * 2015-09-08 2017-02-21 엠더블유엔테크 주식회사 Hand Float Menu System
CN105183173B (en) * 2015-10-12 2018-08-28 重庆中电大宇卫星应用技术研究所 It is a kind of by tactics and Morse code gesture input and the device for being converted to voice
DE102015221860A1 (en) * 2015-11-06 2017-05-11 BSH Hausgeräte GmbH System and method for facilitating operation of a household appliance
CN105872815A (en) * 2015-11-25 2016-08-17 乐视网信息技术(北京)股份有限公司 Video playing method and device
JP2017129406A (en) * 2016-01-19 2017-07-27 日本電気通信システム株式会社 Information processing device, smart glass and control method thereof, and computer program
SE541141C2 (en) * 2016-04-18 2019-04-16 Moonlightning Ind Ab Focus pulling with a stereo vision camera system
CN105915715A (en) * 2016-05-25 2016-08-31 努比亚技术有限公司 Incoming call reminding method and device thereof, wearable audio device and mobile terminal
WO2017217752A1 (en) * 2016-06-17 2017-12-21 이철윤 System and method for generating three dimensional composite image of product and packing box
CN106155315A (en) * 2016-06-28 2016-11-23 广东欧珀移动通信有限公司 The adding method of augmented reality effect, device and mobile terminal in a kind of shooting
CN106157363A (en) * 2016-06-28 2016-11-23 广东欧珀移动通信有限公司 A kind of photographic method based on augmented reality, device and mobile terminal
CN106125932A (en) * 2016-06-28 2016-11-16 广东欧珀移动通信有限公司 The recognition methods of destination object, device and mobile terminal in a kind of augmented reality
CN106066701B (en) * 2016-07-05 2019-07-26 上海智旭商务咨询有限公司 A kind of AR and VR data processing equipment and method
KR20180009170A (en) * 2016-07-18 2018-01-26 엘지전자 주식회사 Mobile terminal and operating method thereof
CN106354257A (en) * 2016-08-30 2017-01-25 湖北睛彩视讯科技有限公司 Mobile scene fusion system and method based on augmented reality technology
CN106980362A (en) * 2016-10-09 2017-07-25 阿里巴巴集团控股有限公司 Input method and device based on virtual reality scenario
JP2018082363A (en) * 2016-11-18 2018-05-24 セイコーエプソン株式会社 Head-mounted display device and method for controlling the same, and computer program
US10996814B2 (en) 2016-11-29 2021-05-04 Real View Imaging Ltd. Tactile feedback in a display system
CN108885533B (en) * 2016-12-21 2021-05-07 杰创科科技有限公司 Combining virtual reality and augmented reality
USD864959S1 (en) 2017-01-04 2019-10-29 Mentor Acquisition One, Llc Computer glasses
WO2018162985A1 (en) * 2017-03-10 2018-09-13 Zyetric Augmented Reality Limited Interactive augmented reality
CN110582741B (en) * 2017-03-21 2024-04-02 交互数字Vc控股公司 Method and system for haptic interaction detection and augmentation in augmented reality
US10489651B2 (en) * 2017-04-14 2019-11-26 Microsoft Technology Licensing, Llc Identifying a position of a marker in an environment
US11054894B2 (en) 2017-05-05 2021-07-06 Microsoft Technology Licensing, Llc Integrated mixed-input system
US20210117680A1 (en) * 2017-05-10 2021-04-22 Humane, Inc. Wearable multimedia device and cloud computing platform with laser projection system
US10895966B2 (en) 2017-06-30 2021-01-19 Microsoft Technology Licensing, Llc Selection using a multi-device mixed interactivity system
US11023109B2 (en) 2017-06-30 2021-06-01 Microsoft Technology Licensing, LLC Annotation using a multi-device mixed interactivity system
CN107340871A (en) * 2017-07-25 2017-11-10 深识全球创新科技(北京)有限公司 The devices and methods therefor and purposes of integrated gesture identification and ultrasonic wave touch feedback
WO2019021447A1 (en) * 2017-07-28 2019-01-31 株式会社オプティム Wearable terminal display system, wearable terminal display method and program
WO2019021446A1 (en) * 2017-07-28 2019-01-31 株式会社オプティム Wearable terminal display system, wearable terminal display method and program
CN107635057A (en) * 2017-07-31 2018-01-26 努比亚技术有限公司 A kind of virtual reality terminal control method, terminal and computer-readable recording medium
US10506217B2 (en) * 2017-10-09 2019-12-10 Facebook Technologies, Llc Head-mounted display tracking system
US20190129607A1 (en) * 2017-11-02 2019-05-02 Samsung Electronics Co., Ltd. Method and device for performing remote control
US10739861B2 (en) * 2018-01-10 2020-08-11 Facebook Technologies, Llc Long distance interaction with artificial reality objects using a near eye display interface
US10706628B2 (en) * 2018-02-28 2020-07-07 Lenovo (Singapore) Pte. Ltd. Content transfer
US20190324549A1 (en) * 2018-04-20 2019-10-24 Immersion Corporation Systems, devices, and methods for providing immersive reality interface modes
JP7056423B2 (en) * 2018-07-10 2022-04-19 オムロン株式会社 Input device
US11360558B2 (en) * 2018-07-17 2022-06-14 Apple Inc. Computer systems with finger devices
US10770035B2 (en) 2018-08-22 2020-09-08 Google Llc Smartphone-based radar system for facilitating awareness of user presence and orientation
US10890653B2 (en) 2018-08-22 2021-01-12 Google Llc Radar-based gesture enhancement for voice interfaces
US10698603B2 (en) * 2018-08-24 2020-06-30 Google Llc Smartphone-based radar system facilitating ease and accuracy of user interactions with displayed objects in an augmented-reality interface
CN109348003A (en) * 2018-09-17 2019-02-15 深圳市泰衡诺科技有限公司 Application control method and device
CN110942518B (en) * 2018-09-24 2024-03-29 苹果公司 Contextual Computer Generated Reality (CGR) digital assistant
KR102620702B1 (en) * 2018-10-12 2024-01-04 삼성전자주식회사 A mobile apparatus and a method for controlling the mobile apparatus
US10788880B2 (en) 2018-10-22 2020-09-29 Google Llc Smartphone-based radar system for determining user intention in a lower-power mode
CN109782639A (en) * 2018-12-29 2019-05-21 深圳市中孚能电气设备有限公司 The control method and control device of a kind of electronic equipment operating mode
JP7223776B2 (en) * 2019-01-24 2023-02-16 マクセル株式会社 Display terminal, application control system and application control method
JP6720385B1 (en) * 2019-02-07 2020-07-08 株式会社メルカリ Program, information processing method, and information processing terminal
CN110109547A (en) * 2019-05-05 2019-08-09 芋头科技(杭州)有限公司 Order Activiation method and system based on gesture identification
JP7331462B2 (en) * 2019-05-24 2023-08-23 京セラドキュメントソリューションズ株式会社 ROBOT SYSTEM, ROBOT CONTROL METHOD AND ELECTRONIC DEVICE
US10747371B1 (en) 2019-06-28 2020-08-18 Konica Minolta Business Solutions U.S.A., Inc. Detection of finger press from live video stream
CN113012214A (en) * 2019-12-20 2021-06-22 北京外号信息技术有限公司 Method and electronic device for setting spatial position of virtual object
JP2022022568A (en) * 2020-06-26 2022-02-07 沖電気工業株式会社 Display operation unit and device
CN113190110A (en) * 2021-03-30 2021-07-30 青岛小鸟看看科技有限公司 Interface element control method and device of head-mounted display equipment
KR20230164185A (en) * 2021-04-08 2023-12-01 스냅 인코포레이티드 Bimanual interactions between mapped hand regions for controlling virtual and graphical elements
CN113141529B (en) * 2021-04-25 2022-02-25 聚好看科技股份有限公司 Display device and media asset playing method
KR102629771B1 (en) * 2021-09-30 2024-01-29 박두고 Wearable device for recognition object using hand or finger tracking
US20230107590A1 (en) * 2021-10-01 2023-04-06 At&T Intellectual Property I, L.P. Augmented reality visualization of enclosed spaces
US20230315208A1 (en) * 2022-04-04 2023-10-05 Snap Inc. Gesture-based application invocation
CN115309271B (en) * 2022-09-29 2023-03-21 南方科技大学 Information display method, device and equipment based on mixed reality and storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020060648A1 (en) * 2000-11-17 2002-05-23 Taichi Matsui Image-display control apparatus
US20060101349A1 (en) * 2000-05-29 2006-05-11 Klony Lieberman Virtual data entry device and method for input of alphanumeric and other data
US20100177035A1 (en) * 2008-10-10 2010-07-15 Schowengerdt Brian T Mobile Computing Device With A Virtual Keyboard
US20110029185A1 (en) * 2008-03-19 2011-02-03 Denso Corporation Vehicular manipulation input apparatus
US20110221669A1 (en) * 2010-02-28 2011-09-15 Osterhout Group, Inc. Gesture control in an augmented reality eyepiece
US20120032874A1 (en) * 2010-08-09 2012-02-09 Sony Corporation Display apparatus assembly
US20130050069A1 (en) * 2011-08-23 2013-02-28 Sony Corporation, A Japanese Corporation Method and system for use in providing three dimensional user interface

Family Cites Families (52)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0981309A (en) * 1995-09-13 1997-03-28 Toshiba Corp Input device
JP3365246B2 (en) 1997-03-14 2003-01-08 ミノルタ株式会社 Electronic still camera
JP3225882B2 (en) * 1997-03-27 2001-11-05 日本電信電話株式会社 Landscape labeling system
DE19917660A1 (en) * 1999-04-19 2000-11-02 Deutsch Zentr Luft & Raumfahrt Method and input device for controlling the position of an object to be graphically represented in a virtual reality
WO2001019365A1 (en) 1999-09-15 2001-03-22 Roche Consumer Health Ag Pharmaceutical and/or cosmetical compositions
US6771294B1 (en) * 1999-12-29 2004-08-03 Petri Pulli User interface
SE0000850D0 (en) * 2000-03-13 2000-03-13 Pink Solution Ab Recognition arrangement
US7215322B2 (en) 2001-05-31 2007-05-08 Siemens Corporate Research, Inc. Input devices for augmented reality applications
US7126558B1 (en) 2001-10-19 2006-10-24 Accenture Global Services Gmbh Industrial augmented reality
WO2003071410A2 (en) * 2002-02-15 2003-08-28 Canesta, Inc. Gesture recognition system using depth perceptive sensors
US6943754B2 (en) * 2002-09-27 2005-09-13 The Boeing Company Gaze tracking system, eye-tracking assembly and an associated method of calibration
US7676079B2 (en) * 2003-09-30 2010-03-09 Canon Kabushiki Kaisha Index identification method and apparatus
IL161002A0 (en) 2004-03-22 2004-08-31 Itay Katz Virtual video keyboard system
CN101375598A (en) * 2004-06-01 2009-02-25 L-3通信公司 Video flashlight/vision alert
US20060200662A1 (en) * 2005-02-01 2006-09-07 Microsoft Corporation Referencing objects in a virtual environment
WO2007011306A2 (en) * 2005-07-20 2007-01-25 Bracco Imaging S.P.A. A method of and apparatus for mapping a virtual model of an object to the object
KR100687737B1 (en) * 2005-03-19 2007-02-27 한국전자통신연구원 Apparatus and method for a virtual mouse based on two-hands gesture
WO2007033354A2 (en) * 2005-09-13 2007-03-22 Spacetime3D, Inc. System and method for providing three-dimensional graphical user interface
JP4851771B2 (en) * 2005-10-24 2012-01-11 京セラ株式会社 Information processing system and portable information terminal
US7509588B2 (en) * 2005-12-30 2009-03-24 Apple Inc. Portable electronic device with interface reconfiguration mode
US7725547B2 (en) * 2006-09-06 2010-05-25 International Business Machines Corporation Informing a user of gestures made by others out of the user's line of sight
JP4961914B2 (en) * 2006-09-08 2012-06-27 ソニー株式会社 Imaging display device and imaging display method
JP5228307B2 (en) * 2006-10-16 2013-07-03 ソニー株式会社 Display device and display method
JP2010512693A (en) * 2006-12-07 2010-04-22 アダックス,インク. System and method for data addition, recording and communication
KR101285360B1 (en) * 2007-01-25 2013-07-11 삼성전자주식회사 Point of interest displaying apparatus and method for using augmented reality
US8942764B2 (en) * 2007-10-01 2015-01-27 Apple Inc. Personal media device controlled via user initiated movements utilizing movement based interfaces
JP4933406B2 (en) * 2007-11-15 2012-05-16 キヤノン株式会社 Image processing apparatus and image processing method
US8165345B2 (en) * 2007-12-07 2012-04-24 Tom Chau Method, system, and computer program for detecting and characterizing motion
JP5250834B2 (en) * 2008-04-03 2013-07-31 コニカミノルタ株式会社 Head-mounted image display device
WO2009128064A2 (en) * 2008-04-14 2009-10-22 Pointgrab Ltd. Vision based pointing device emulation
US8971565B2 (en) 2008-05-29 2015-03-03 Hie-D Technologies, Llc Human interface electronic device
US8397181B2 (en) 2008-11-17 2013-03-12 Honeywell International Inc. Method and apparatus for marking a position of a real world object in a see-through display
US9041660B2 (en) * 2008-12-09 2015-05-26 Microsoft Technology Licensing, Llc Soft keyboard control
EP2391972B1 (en) 2009-02-02 2015-05-27 Eyesight Mobile Technologies Ltd. System and method for object recognition and tracking in a video stream
JP5620928B2 (en) * 2009-02-20 2014-11-05 コーニンクレッカ フィリップス エヌ ヴェ System, method and apparatus for placing apparatus in active mode
US8009022B2 (en) 2009-05-29 2011-08-30 Microsoft Corporation Systems and methods for immersive interaction with virtual objects
KR101622196B1 (en) * 2009-09-07 2016-05-18 삼성전자주식회사 Apparatus and method for providing poi information in portable terminal
US20110107216A1 (en) * 2009-11-03 2011-05-05 Qualcomm Incorporated Gesture-based user interface
JP4679661B1 (en) * 2009-12-15 2011-04-27 株式会社東芝 Information presenting apparatus, information presenting method, and program
KR20110075250A (en) 2009-12-28 2011-07-06 엘지전자 주식회사 Method and apparatus for tracking an object using a tracking mode
US20110213664A1 (en) * 2010-02-28 2011-09-01 Osterhout Group, Inc. Local advertising content on an interactive head-mounted eyepiece
US9128281B2 (en) * 2010-09-14 2015-09-08 Microsoft Technology Licensing, Llc Eyepiece with uniformly illuminated reflective display
US20110214082A1 (en) * 2010-02-28 2011-09-01 Osterhout Group, Inc. Projection triggering through an external marker in an augmented reality eyepiece
US8788197B2 (en) 2010-04-30 2014-07-22 Ryan Fink Visual training devices, systems, and methods
US8593375B2 (en) 2010-07-23 2013-11-26 Gregory A Maltz Eye gaze user interface and method
US20120062602A1 (en) * 2010-09-13 2012-03-15 Nokia Corporation Method and apparatus for rendering a content display
US8941559B2 (en) 2010-09-21 2015-01-27 Microsoft Corporation Opacity filter for display device
US8768006B2 (en) * 2010-10-19 2014-07-01 Hewlett-Packard Development Company, L.P. Hand gesture recognition
CN102147926A (en) * 2011-01-17 2011-08-10 中兴通讯股份有限公司 Three-dimensional (3D) icon processing method and device and mobile terminal
US9336240B2 (en) * 2011-07-15 2016-05-10 Apple Inc. Geo-tagging digital images
KR20220032059A (en) * 2011-09-19 2022-03-15 아이사이트 모빌 테크놀로지 엘티디 Touch free interface for augmented reality systems
CN108469899B (en) 2012-03-13 2021-08-10 视力移动技术有限公司 Method of identifying an aiming point or area in a viewing space of a wearable display device

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060101349A1 (en) * 2000-05-29 2006-05-11 Klony Lieberman Virtual data entry device and method for input of alphanumeric and other data
US20020060648A1 (en) * 2000-11-17 2002-05-23 Taichi Matsui Image-display control apparatus
US6822643B2 (en) * 2000-11-17 2004-11-23 Canon Kabushiki Kaisha Image-display control apparatus
US20110029185A1 (en) * 2008-03-19 2011-02-03 Denso Corporation Vehicular manipulation input apparatus
US20100177035A1 (en) * 2008-10-10 2010-07-15 Schowengerdt Brian T Mobile Computing Device With A Virtual Keyboard
US20110221669A1 (en) * 2010-02-28 2011-09-15 Osterhout Group, Inc. Gesture control in an augmented reality eyepiece
US20120032874A1 (en) * 2010-08-09 2012-02-09 Sony Corporation Display apparatus assembly
US20130050069A1 (en) * 2011-08-23 2013-02-28 Sony Corporation, A Japanese Corporation Method and system for use in providing three dimensional user interface

Cited By (189)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140358002A1 (en) * 2011-12-23 2014-12-04 Koninklijke Philips N.V. Method and apparatus for interactive display of three dimensional ultrasound images
US10966684B2 (en) * 2011-12-23 2021-04-06 Koninklijke Philips N.V Method and apparatus for interactive display of three dimensional ultrasound images
US20140028716A1 (en) * 2012-07-30 2014-01-30 Mitac International Corp. Method and electronic device for generating an instruction in an augmented reality environment
US9836128B2 (en) * 2012-11-02 2017-12-05 Samsung Electronics Co., Ltd. Method and device for providing information regarding an object
US20140125580A1 (en) * 2012-11-02 2014-05-08 Samsung Electronics Co., Ltd. Method and device for providing information regarding an object
US10646153B1 (en) 2013-01-19 2020-05-12 Bertec Corporation Force measurement system
US11052288B1 (en) 2013-01-19 2021-07-06 Bertec Corporation Force measurement system
US11857331B1 (en) 2013-01-19 2024-01-02 Bertec Corporation Force measurement system
US10413230B1 (en) 2013-01-19 2019-09-17 Bertec Corporation Force measurement system
US11311209B1 (en) 2013-01-19 2022-04-26 Bertec Corporation Force measurement system and a motion base used therein
US9770203B1 (en) 2013-01-19 2017-09-26 Bertec Corporation Force measurement system and a method of testing a subject
US10231662B1 (en) 2013-01-19 2019-03-19 Bertec Corporation Force measurement system
US9526443B1 (en) * 2013-01-19 2016-12-27 Bertec Corporation Force and/or motion measurement system and a method of testing a subject
US10010286B1 (en) 2013-01-19 2018-07-03 Bertec Corporation Force measurement system
US11540744B1 (en) 2013-01-19 2023-01-03 Bertec Corporation Force measurement system
US10856796B1 (en) 2013-01-19 2020-12-08 Bertec Corporation Force measurement system
US20140225918A1 (en) * 2013-02-14 2014-08-14 Qualcomm Incorporated Human-body-gesture-based region and volume selection for hmd
US10133342B2 (en) * 2013-02-14 2018-11-20 Qualcomm Incorporated Human-body-gesture-based region and volume selection for HMD
US11262835B2 (en) * 2013-02-14 2022-03-01 Qualcomm Incorporated Human-body-gesture-based region and volume selection for HMD
US9965896B2 (en) * 2013-02-21 2018-05-08 Fujitsu Limited Display device and display method
US20150356789A1 (en) * 2013-02-21 2015-12-10 Fujitsu Limited Display device and display method
US20140240226A1 (en) * 2013-02-27 2014-08-28 Robert Bosch Gmbh User Interface Apparatus
US9122916B2 (en) * 2013-03-14 2015-09-01 Honda Motor Co., Ltd. Three dimensional fingertip tracking
US20140270352A1 (en) * 2013-03-14 2014-09-18 Honda Motor Co., Ltd. Three dimensional fingertip tracking
US20140285520A1 (en) * 2013-03-22 2014-09-25 Industry-University Cooperation Foundation Hanyang University Wearable display device using augmented reality
US9507426B2 (en) * 2013-03-27 2016-11-29 Google Inc. Using the Z-axis in user interfaces for head mountable displays
US9213403B1 (en) 2013-03-27 2015-12-15 Google Inc. Methods to pan, zoom, crop, and proportionally move on a head mountable display
US9811154B2 (en) 2013-03-27 2017-11-07 Google Inc. Methods to pan, zoom, crop, and proportionally move on a head mountable display
US20140306881A1 (en) * 2013-04-15 2014-10-16 Olympus Corporation Wearable device, program and display controlling method of wearable device
US20160103437A1 (en) * 2013-06-27 2016-04-14 Abb Technology Ltd Method and data presenting device for assisting a remote user to provide instructions
US9829873B2 (en) * 2013-06-27 2017-11-28 Abb Schweiz Ag Method and data presenting device for assisting a remote user to provide instructions
US10866093B2 (en) 2013-07-12 2020-12-15 Magic Leap, Inc. Method and system for retrieving data in response to user input
US10352693B2 (en) 2013-07-12 2019-07-16 Magic Leap, Inc. Method and system for obtaining texture data of a space
US11656677B2 (en) 2013-07-12 2023-05-23 Magic Leap, Inc. Planar waveguide apparatus with diffraction element(s) and system employing same
US10533850B2 (en) 2013-07-12 2020-01-14 Magic Leap, Inc. Method and system for inserting recognized object data into a virtual world
US10295338B2 (en) 2013-07-12 2019-05-21 Magic Leap, Inc. Method and system for generating map data from an image
US10495453B2 (en) 2013-07-12 2019-12-03 Magic Leap, Inc. Augmented reality system totems and methods of using same
US11029147B2 (en) 2013-07-12 2021-06-08 Magic Leap, Inc. Method and system for facilitating surgery using an augmented reality system
US20150243105A1 (en) * 2013-07-12 2015-08-27 Magic Leap, Inc. Method and system for interacting with user interfaces
US10571263B2 (en) 2013-07-12 2020-02-25 Magic Leap, Inc. User and object interaction with an augmented reality scenario
US10473459B2 (en) 2013-07-12 2019-11-12 Magic Leap, Inc. Method and system for determining user input based on totem
US11060858B2 (en) 2013-07-12 2021-07-13 Magic Leap, Inc. Method and system for generating a virtual user interface related to a totem
US10591286B2 (en) 2013-07-12 2020-03-17 Magic Leap, Inc. Method and system for generating virtual rooms
US10767986B2 (en) * 2013-07-12 2020-09-08 Magic Leap, Inc. Method and system for interacting with user interfaces
US11221213B2 (en) 2013-07-12 2022-01-11 Magic Leap, Inc. Method and system for generating a retail experience using an augmented reality system
US10641603B2 (en) 2013-07-12 2020-05-05 Magic Leap, Inc. Method and system for updating a virtual world
US10408613B2 (en) 2013-07-12 2019-09-10 Magic Leap, Inc. Method and system for rendering virtual content
US11921471B2 (en) 2013-08-16 2024-03-05 Meta Platforms Technologies, Llc Systems, articles, and methods for wearable devices having secondary power sources in links of a band for providing secondary power in addition to a primary power source
US11644799B2 (en) 2013-10-04 2023-05-09 Meta Platforms Technologies, Llc Systems, articles and methods for wearable electronic devices employing contact sensors
US11079846B2 (en) 2013-11-12 2021-08-03 Facebook Technologies, Llc Systems, articles, and methods for capacitive electromyography sensors
US9740935B2 (en) * 2013-11-26 2017-08-22 Honeywell International Inc. Maintenance assistant system
US20150146007A1 (en) * 2013-11-26 2015-05-28 Honeywell International Inc. Maintenance assistant system
US11666264B1 (en) 2013-11-27 2023-06-06 Meta Platforms Technologies, Llc Systems, articles, and methods for electromyography sensors
US20150271396A1 (en) * 2014-03-24 2015-09-24 Samsung Electronics Co., Ltd. Electronic device and method for image data processing
US9560272B2 (en) * 2014-03-24 2017-01-31 Samsung Electronics Co., Ltd. Electronic device and method for image data processing
US10602200B2 (en) 2014-05-28 2020-03-24 Lucasfilm Entertainment Company Ltd. Switching modes of a media content item
US11508125B1 (en) 2014-05-28 2022-11-22 Lucasfilm Entertainment Company Ltd. Navigating a virtual environment of a media content item
US10600245B1 (en) * 2014-05-28 2020-03-24 Lucasfilm Entertainment Company Ltd. Navigating a virtual environment of a media content item
US10484673B2 (en) * 2014-06-05 2019-11-19 Samsung Electronics Co., Ltd. Wearable device and method for providing augmented reality information
US20150358614A1 (en) * 2014-06-05 2015-12-10 Samsung Electronics Co., Ltd. Wearable device and method for providing augmented reality information
US9886848B2 (en) * 2014-06-12 2018-02-06 Lg Electronics Inc. Mobile terminal and control system
US20150364037A1 (en) * 2014-06-12 2015-12-17 Lg Electronics Inc. Mobile terminal and control system
US20160048024A1 (en) * 2014-08-13 2016-02-18 Beijing Lenovo Software Ltd. Information processing method and electronic device
US9696551B2 (en) * 2014-08-13 2017-07-04 Beijing Lenovo Software Ltd. Information processing method and electronic device
US10241568B2 (en) * 2014-08-18 2019-03-26 Universal City Studios Llc Systems and methods for generating augmented and virtual reality images
US20160048203A1 (en) * 2014-08-18 2016-02-18 Universal City Studios Llc Systems and methods for generating augmented and virtual reality images
US20190196581A1 (en) * 2014-08-18 2019-06-27 Universal City Studios Llc Systems and methods for generating augmented and virtual reality images
US9690375B2 (en) * 2014-08-18 2017-06-27 Universal City Studios Llc Systems and methods for generating augmented and virtual reality images
US10606348B2 (en) * 2014-08-18 2020-03-31 Universal City Studios Llc Systems and methods for generating augmented and virtual reality images
US11586277B2 (en) 2014-08-18 2023-02-21 Universal City Studios Llc Systems and methods for generating augmented and virtual reality images
US20160054802A1 (en) * 2014-08-21 2016-02-25 Samsung Electronics Co., Ltd. Sensor based UI in HMD incorporating light turning element
US9910504B2 (en) * 2014-08-21 2018-03-06 Samsung Electronics Co., Ltd. Sensor based UI in HMD incorporating light turning element
US11768540B2 (en) 2014-09-09 2023-09-26 Ultrahaptics Ip Ltd Method and apparatus for modulating haptic feedback
US11656686B2 (en) * 2014-09-09 2023-05-23 Ultrahaptics Ip Ltd Method and apparatus for modulating haptic feedback
US10684367B2 (en) 2014-11-26 2020-06-16 Samsung Electronics Co., Ltd. Ultrasound sensor and object detecting method thereof
US10310624B2 (en) * 2014-12-17 2019-06-04 Konica Minolta, Inc. Electronic apparatus, method for controlling electronic apparatus, and control program for the same
US10606359B2 (en) 2014-12-19 2020-03-31 Immersion Corporation Systems and methods for haptically-enabled interactions with objects
US9685005B2 (en) * 2015-01-02 2017-06-20 Eon Reality, Inc. Virtual lasers for interacting with augmented reality environments
US20170061700A1 (en) * 2015-02-13 2017-03-02 Julian Michael Urbach Intercommunication between a head mounted display and a real world object
US11830351B2 (en) 2015-02-20 2023-11-28 Ultrahaptics Ip Ltd Algorithm improvements in a haptic system
US20180225520A1 (en) * 2015-02-23 2018-08-09 Vivint, Inc. Techniques for identifying and indexing distinguishing features in a video feed
US10963701B2 (en) * 2015-02-23 2021-03-30 Vivint, Inc. Techniques for identifying and indexing distinguishing features in a video feed
JP2016161734A (en) * 2015-03-02 2016-09-05 セイコーエプソン株式会社 Display device, method for controlling display device, and program
US9983840B2 (en) * 2015-04-24 2018-05-29 Panasonic Intellectual Property Corporation Of America Head-mounted display apparatus worn on user's head
US20160313956A1 (en) * 2015-04-24 2016-10-27 Panasonic Intellectual Property Corporation Of America Head-mounted display apparatus worn on user's head
US10747488B2 (en) * 2015-04-24 2020-08-18 Panasonic Intellectual Property Corporation Of America Head-mounted display apparatus worn on user's head
US9946504B2 (en) * 2015-04-24 2018-04-17 Panasonic Intellectual Property Corporation Of America Head-mounted display apparatus worn on user's head
US20190317717A1 (en) * 2015-04-24 2019-10-17 Panasonic Intellectual Property Corporation Of America Head-mounted display apparatus worn on user's head
US20180039466A1 (en) * 2015-04-24 2018-02-08 Panasonic Intellectual Property Corporation Of America Head-mounted display apparatus worn on user's head
US10359982B2 (en) * 2015-04-24 2019-07-23 Panasonic Intellectual Property Corporation Of America Head-mounted display apparatus worn on user's head
US20180239573A1 (en) * 2015-04-24 2018-08-23 Panasonic Intellectual Property Corporation Of America Head-mounted display apparatus worn on user's head
US10055888B2 (en) 2015-04-28 2018-08-21 Microsoft Technology Licensing, Llc Producing and consuming metadata within multi-dimensional data
WO2016206874A1 (en) * 2015-06-23 2016-12-29 Siemens Aktiengesellschaft Interaction system
US10409443B2 (en) * 2015-06-24 2019-09-10 Microsoft Technology Licensing, Llc Contextual cursor display based on hand tracking
US20160378294A1 (en) * 2015-06-24 2016-12-29 Shawn Crispin Wright Contextual cursor display based on hand tracking
US10156726B2 (en) * 2015-06-29 2018-12-18 Microsoft Technology Licensing, Llc Graphene in optical systems
CN108369640A (en) * 2015-12-17 2018-08-03 诺基亚技术有限公司 Method, apparatus or computer program for controlling image processing of a captured image of a scene to adapt the captured image
US11587202B2 (en) * 2015-12-17 2023-02-21 Nokia Technologies Oy Method, apparatus or computer program for controlling image processing of a captured image of a scene to adapt the captured image
US10082940B2 (en) 2015-12-23 2018-09-25 Intel Corporation Text functions in augmented reality
US9697648B1 (en) 2015-12-23 2017-07-04 Intel Corporation Text functions in augmented reality
WO2017112099A1 (en) * 2015-12-23 2017-06-29 Intel Corporation Text functions in augmented reality
CN105843390A (en) * 2016-02-24 2016-08-10 上海理湃光晶技术有限公司 Image scaling method and AR (Augmented Reality) glasses based on the method
US10168768B1 (en) 2016-03-02 2019-01-01 Meta Company Systems and methods to facilitate interactions in an interactive space
US10133345B2 (en) 2016-03-22 2018-11-20 Microsoft Technology Licensing, Llc Virtual-reality navigation
US9933855B2 (en) 2016-03-31 2018-04-03 Intel Corporation Augmented reality in a field of view including a reflection
WO2017172211A1 (en) * 2016-03-31 2017-10-05 Intel Corporation Augmented reality in a field of view including a reflection
US10438419B2 (en) 2016-05-13 2019-10-08 Meta View, Inc. System and method for modifying virtual objects in a virtual environment in response to user interactions
US10186088B2 (en) 2016-05-13 2019-01-22 Meta Company System and method for managing interactive virtual frames for virtual objects in a virtual environment
US9990779B2 (en) * 2016-05-13 2018-06-05 Meta Company System and method for modifying virtual objects in a virtual environment in response to user interactions
ES2643863A1 (en) * 2016-05-24 2017-11-24 Sonovisión Ingenieros España, S.A.U. Method to provide, through guided augmented reality, inspection and support in installation or maintenance of processes for complex assemblies compatible with S1000D, and device that makes use of the same (machine translation by Google Translate, not legally binding)
US10990174B2 (en) 2016-07-25 2021-04-27 Facebook Technologies, Llc Methods and apparatus for predicting musculo-skeletal position information using wearable autonomous sensors
US11337652B2 (en) 2016-07-25 2022-05-24 Facebook Technologies, Llc System and method for measuring the movements of articulated rigid bodies
US10656711B2 (en) 2016-07-25 2020-05-19 Facebook Technologies, Llc Methods and apparatus for inferring user intent based on neuromuscular signals
WO2018071019A1 (en) * 2016-10-13 2018-04-19 Ford Motor Company Dual-mode augmented reality interfaces for mobile devices
CN109804638A (en) * 2016-10-13 2019-05-24 福特汽车公司 Dual-mode augmented reality interfaces for mobile devices
US11119585B2 (en) 2016-10-13 2021-09-14 Ford Motor Company Dual-mode augmented reality interfaces for mobile devices
US20190379925A1 (en) * 2016-10-26 2019-12-12 OrCam Technologies, Ltd. Wearable device and methods for transmitting information based on physical distance
US10887486B2 (en) * 2016-10-26 2021-01-05 Orcam Technologies, Ltd. Wearable device and methods for transmitting information based on physical distance
US10405024B2 (en) * 2016-10-26 2019-09-03 Orcam Technologies Ltd. Wearable device and methods for transmitting information based on physical distance
US11409497B2 (en) 2016-12-23 2022-08-09 Realwear, Inc. Hands-free navigation of touch-based operating systems
US11947752B2 (en) 2016-12-23 2024-04-02 Realwear, Inc. Customizing user interfaces of binary applications
US11099716B2 (en) 2016-12-23 2021-08-24 Realwear, Inc. Context based content navigation for wearable display
US11507216B2 (en) * 2016-12-23 2022-11-22 Realwear, Inc. Customizing user interfaces of binary applications
US20180189474A1 (en) * 2016-12-30 2018-07-05 Baidu Online Network Technology (Beijing) Co., Ltd. Method and Electronic Device for Unlocking Electronic Device
US10620779B2 (en) * 2017-04-24 2020-04-14 Microsoft Technology Licensing, Llc Navigating a holographic image
US10481755B1 (en) * 2017-04-28 2019-11-19 Meta View, Inc. Systems and methods to present virtual content in an interactive space
US20230333378A1 (en) * 2017-08-25 2023-10-19 Snap Inc. Wristwatch based interface for augmented reality eyewear
US10068403B1 (en) 2017-09-21 2018-09-04 Universal City Studios Llc Locker management techniques
US10957135B2 (en) 2017-09-21 2021-03-23 Universal City Studios Llc Locker management techniques
US11635736B2 (en) 2017-10-19 2023-04-25 Meta Platforms Technologies, Llc Systems and methods for identifying biological structures associated with neuromuscular source signals
US10823964B2 (en) * 2017-11-02 2020-11-03 Olympus Corporation Work assistance apparatus, work assistance method, and computer-readable, non-transitory recording medium recording work assistance program executed by computer
US20190129179A1 (en) * 2017-11-02 2019-05-02 Olympus Corporation Work assistance apparatus, work assistance method, and computer-readable, non-transitory recording medium recording work assistance program executed by computer
US11921928B2 (en) 2017-11-26 2024-03-05 Ultrahaptics Ip Ltd Haptic effects from focused acoustic fields
US11321880B2 (en) 2017-12-22 2022-05-03 Sony Corporation Information processor, information processing method, and program for specifying an important region of an operation target in a moving image
US11069148B2 (en) 2018-01-25 2021-07-20 Facebook Technologies, Llc Visualization of reconstructed handstate information
US11883847B2 (en) 2018-05-02 2024-01-30 Ultraleap Limited Blocking plate structure for improved acoustic transmission efficiency
US20190339837A1 (en) * 2018-05-04 2019-11-07 Oculus Vr, Llc Copy and Paste in a Virtual Reality Environment
US11216069B2 (en) 2018-05-08 2022-01-04 Facebook Technologies, Llc Systems and methods for improved speech recognition using neuromuscular information
US11036302B1 (en) 2018-05-08 2021-06-15 Facebook Technologies, Llc Wearable devices and methods for improved speech recognition
US10937414B2 (en) 2018-05-08 2021-03-02 Facebook Technologies, Llc Systems and methods for text input using neuromuscular information
US10768426B2 (en) 2018-05-21 2020-09-08 Microsoft Technology Licensing, Llc Head mounted display system receiving three-dimensional push notification
US10687759B2 (en) 2018-05-29 2020-06-23 Facebook Technologies, Llc Shielding techniques for noise reduction in surface electromyography signal measurement and related systems and methods
US11129569B1 (en) 2018-05-29 2021-09-28 Facebook Technologies, Llc Shielding techniques for noise reduction in surface electromyography signal measurement and related systems and methods
US10970374B2 (en) 2018-06-14 2021-04-06 Facebook Technologies, Llc User identification and authentication with neuromuscular signatures
US11045137B2 (en) 2018-07-19 2021-06-29 Facebook Technologies, Llc Methods and apparatus for improved signal robustness for a wearable neuromuscular recording device
US11179066B2 (en) 2018-08-13 2021-11-23 Facebook Technologies, Llc Real-time spike detection and identification
US10909762B2 (en) 2018-08-24 2021-02-02 Microsoft Technology Licensing, Llc Gestures for facilitating interaction with pages in a mixed reality environment
US10905350B2 (en) 2018-08-31 2021-02-02 Facebook Technologies, Llc Camera-guided interpretation of neuromuscular signals
US10842407B2 (en) 2018-08-31 2020-11-24 Facebook Technologies, Llc Camera-guided interpretation of neuromuscular signals
CN112789577A (en) * 2018-09-20 2021-05-11 脸谱科技有限责任公司 Neuromuscular text input, writing and drawing in augmented reality systems
US11567573B2 (en) * 2018-09-20 2023-01-31 Meta Platforms Technologies, Llc Neuromuscular text entry, writing and drawing in augmented reality systems
US10921764B2 (en) 2018-09-26 2021-02-16 Facebook Technologies, Llc Neuromuscular control of physical objects in an environment
US10970936B2 (en) 2018-10-05 2021-04-06 Facebook Technologies, Llc Use of neuromuscular signals to provide enhanced interactions with physical objects in an augmented reality environment
US10929099B2 (en) * 2018-11-02 2021-02-23 Bose Corporation Spatialized virtual personal assistant
US11797087B2 (en) 2018-11-27 2023-10-24 Meta Platforms Technologies, Llc Methods and apparatus for autocalibration of a wearable electrode sensor system
US11941176B1 (en) 2018-11-27 2024-03-26 Meta Platforms Technologies, Llc Methods and apparatus for autocalibration of a wearable electrode sensor system
US11804041B2 (en) 2018-12-04 2023-10-31 Apple Inc. Method, device, and system for generating affordances linked to a representation of an item
US10789952B2 (en) * 2018-12-20 2020-09-29 Microsoft Technology Licensing, Llc Voice command execution from auxiliary input
EP3847531A4 (en) * 2019-01-31 2021-11-03 Huawei Technologies Co., Ltd. Hand-over-face input sensing for interaction with a device having a built-in camera
US11393254B2 (en) 2019-01-31 2022-07-19 Huawei Technologies Co., Ltd. Hand-over-face input sensing for interaction with a device having a built-in camera
US10905383B2 (en) 2019-02-28 2021-02-02 Facebook Technologies, Llc Methods and apparatus for unsupervised one-shot machine learning for classification of human gestures and estimation of applied forces
US11481030B2 (en) 2019-03-29 2022-10-25 Meta Platforms Technologies, Llc Methods and apparatus for gesture detection and classification
US11842517B2 (en) 2019-04-12 2023-12-12 Ultrahaptics Ip Ltd Using iterative 3D-model fitting for domain adaptation of a hand-pose-estimation neural network
US11481031B1 (en) 2019-04-30 2022-10-25 Meta Platforms Technologies, Llc Devices, systems, and methods for controlling computing devices via neuromuscular signals of users
USD1009884S1 (en) * 2019-07-26 2024-01-02 Sony Corporation Mixed reality eyeglasses or portion thereof with an animated graphical user interface
US11354896B2 (en) 2019-07-31 2022-06-07 Seiko Epson Corporation Display device, display method, and computer program
US10909767B1 (en) * 2019-08-01 2021-02-02 International Business Machines Corporation Focal and interaction driven content replacement into augmented reality
US11493993B2 (en) 2019-09-04 2022-11-08 Meta Platforms Technologies, Llc Systems, methods, and interfaces for performing inputs based on neuromuscular control
US11747915B2 (en) 2019-09-30 2023-09-05 Snap Inc. Smart ring for manipulating virtual objects displayed by a wearable device
US20220113814A1 (en) 2019-09-30 2022-04-14 Yu Jiang Tham Smart ring for manipulating virtual objects displayed by a wearable device
US20230131474A1 (en) * 2019-10-16 2023-04-27 Samuel R. Pecota Augmented reality marine navigation
US20210158630A1 (en) * 2019-11-25 2021-05-27 Facebook Technologies, Llc Systems and methods for contextualized interactions with an environment
US11907423B2 (en) * 2019-11-25 2024-02-20 Meta Platforms Technologies, Llc Systems and methods for contextualized interactions with an environment
US11961494B1 (en) 2020-03-27 2024-04-16 Meta Platforms Technologies, Llc Electromagnetic interference reduction in extended reality environments
US11798429B1 (en) 2020-05-04 2023-10-24 Snap Inc. Virtual tutorials for musical instruments with finger tracking in augmented reality
US11520399B2 (en) 2020-05-26 2022-12-06 Snap Inc. Interactive augmented reality experiences using positional tracking
US11816267B2 (en) 2020-06-23 2023-11-14 Ultraleap Limited Features of airborne ultrasonic fields
US11886639B2 (en) 2020-09-17 2024-01-30 Ultraleap Limited Ultrahapticons
US11925863B2 (en) 2020-09-18 2024-03-12 Snap Inc. Tracking hand gestures for interactive game control in augmented reality
US20220103748A1 (en) * 2020-09-28 2022-03-31 Ilteris Canberk Touchless photo capture in response to detected hand gestures
US11546505B2 (en) * 2020-09-28 2023-01-03 Snap Inc. Touchless photo capture in response to detected hand gestures
US11644902B2 (en) * 2020-11-30 2023-05-09 Google Llc Gesture-based content transfer
US11740313B2 (en) 2020-12-30 2023-08-29 Snap Inc. Augmented reality precision tracking and display
US11531402B1 (en) 2021-02-25 2022-12-20 Snap Inc. Bimanual gestures for controlling virtual and graphical elements
US11868531B1 (en) 2021-04-08 2024-01-09 Meta Platforms Technologies, Llc Wearable device providing for thumb-to-finger-based input gestures detected based on neuromuscular signals, and systems and methods of use thereof
US11861070B2 (en) 2021-04-19 2024-01-02 Snap Inc. Hand gestures for animating and controlling virtual and graphical elements
US20220350997A1 (en) * 2021-04-29 2022-11-03 Google Llc Pointer-based content recognition using a head-mounted device
US11435857B1 (en) * 2021-04-29 2022-09-06 Google Llc Content access and navigation using a head-mounted device
CN114089879A (en) * 2021-11-15 2022-02-25 北京灵犀微光科技有限公司 Cursor control method of augmented reality display equipment

Also Published As

Publication number Publication date
KR20140069124A (en) 2014-06-09
CN115167675A (en) 2022-10-11
US20170052599A1 (en) 2017-02-23
US20160320855A1 (en) 2016-11-03
US11093045B2 (en) 2021-08-17
CN103858073A (en) 2014-06-11
JP2021007022A (en) 2021-01-21
US20160259423A1 (en) 2016-09-08
CN103858073B (en) 2022-07-29
KR20220032059A (en) 2022-03-15
US20200097093A1 (en) 2020-03-26
JP2014531662A (en) 2014-11-27
JP7297216B2 (en) 2023-06-26
KR20190133080A (en) 2019-11-29
US20160306433A1 (en) 2016-10-20
US20160291699A1 (en) 2016-10-06
WO2013093906A1 (en) 2013-06-27
US11494000B2 (en) 2022-11-08
US20220107687A1 (en) 2022-04-07
JP2018028922A (en) 2018-02-22
US10401967B2 (en) 2019-09-03

Similar Documents

Publication Publication Date Title
US11494000B2 (en) Touch free interface for augmented reality systems
US11262835B2 (en) Human-body-gesture-based region and volume selection for HMD
US20210407203A1 (en) Augmented reality experiences using speech and text captions
US10120454B2 (en) Gesture recognition control device
CN105229582B (en) Gesture detection based on proximity sensor and image sensor
US9329678B2 (en) Augmented reality overlay for control devices
JP7092028B2 (en) Information processing equipment, information processing methods, and programs
US11755122B2 (en) Hand gesture-based emojis
US20120229509A1 (en) System and method for user interaction
US20200142495A1 (en) Gesture recognition control device
EP2886173A1 (en) Augmented reality overlay for control devices

Legal Events

Date Code Title Description
AS Assignment

Owner name: EYESIGHT MOBILE TECHNOLOGIES LTD., ISRAEL

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KATZ, ITAY;SHENFELD, AMNON;SIGNING DATES FROM 20140318 TO 20140319;REEL/FRAME:032482/0531

STCV Information on status: appeal procedure

Free format text: BOARD OF APPEALS DECISION RENDERED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- AFTER EXAMINER'S ANSWER OR BOARD OF APPEALS DECISION