WO2003003185A1 - System for establishing a user interface - Google Patents


Info

Publication number
WO2003003185A1
Authority
WO
WIPO (PCT)
Prior art keywords
user interface
pointing
intended
interface device
tracking
Application number
PCT/FI2002/000526
Other languages
French (fr)
Inventor
Ismo Rakkolainen
Original Assignee
Ismo Rakkolainen
Priority claimed from FI20011321A external-priority patent/FI20011321A0/en
Application filed by Ismo Rakkolainen filed Critical Ismo Rakkolainen
Publication of WO2003003185A1 publication Critical patent/WO2003003185A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 Constructional details or arrangements
    • G06F1/1613 Constructional details or arrangements for portable computers
    • G06F1/1626 Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
    • G06F1/1633 Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1637 Details related to the display arrangement, including those related to the mounting of the display in the housing
    • G06F1/1684 Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F1/1686 Constructional details or arrangements related to integrated I/O peripherals, the I/O peripheral being an integrated camera
    • G06F1/1694 Constructional details or arrangements related to integrated I/O peripherals, the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304 Detection arrangements using opto-electronic means
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346 Pointing devices displaced or positioned by the user, with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors

Definitions

  • the present invention relates to a system according to the preamble of the appended claim 1 for establishing a user interface.
  • a good user interface is as intuitive as possible; it blends into normal life, and it is not always necessary to separately learn to use it.
  • the best-known user interface is the keyboard and mouse of a computer. The user's eyes are on the display, on which a cursor moved by the mouse moves on a two-dimensional surface.
  • GUI graphical user interface
  • WIMP Windows-Icon-Menu-Pointer
  • Other electric devices also contain a user interface that is typically attached to the device or is a separate remote controller.
  • research on virtual reality and multimodal user interfaces has presented many human-oriented and sense-oriented ways of communicating with a computer, in which for example several senses and other cues are utilized.
  • the methods include for example speech and eye recognition, recognition of a hand, expression or gesture that is based on computer vision, or touch feedback.
  • the devices to be used include for example a mouse, a display, a touch screen, a pen, a light pen, a data glove, a pedal, a virtual visor, a force feedback device, or another multimodal device.
  • One known wireless user interface device is the Xerox® PARCTAB, a palmtop computer equipped with a touch screen that communicates wirelessly with the applications of the workstation by means of an infrared link. The purpose of the device is to function as a user interface device. In the so-called Ubiquitous Computing principle the wireless connection to the devices in the environment is of great importance.
  • There is also a known inViso® eCase PDA device that is equipped with a micro display in which information is presented in a manner similar to a normal-sized display. The device has a touch sensitive surface that functions as a mouse, as well as control buttons, and by means of auxiliary devices it can be used for wireless data retrieval and communication, for example via e-mail.
  • VR virtual reality; AR augmented reality
  • virtual visors are display devices to be worn on the head, by means of which an image of the surrounding world is created in virtual reality.
  • the virtual visor and the data glove are tracked with special tracking devices that function electromechanically, electromagnetically, acoustically or optically.
  • Augmented reality utilizes semi-permeable displays, wherein the real view is complemented with artificial objects that are aligned on top of the view.
  • the so-called wearable computing principle is related to augmented reality; therein the display and the computer move along with the user and can be worn.
  • the devices have been developed for example in the MIThrill project (MIT Media Lab, USA) and in the HIT Lab laboratory (Human Interface Technology Lab, University of Washington, USA).
  • such a system typically utilizes a wirelessly operating apparatus, a small display attached to glasses or a covering display (HMD, Head Mounted Display), as well as a separate hand-held control device that resembles a palmtop microcomputer or is a cable-connected keyboard. It is also possible to use head-mounted cameras and new kinds of I/O (Input/Output) devices, such as video analysis in which a finger is used as a pointing device.
  • I/O devices Input/Output
  • MARISIL Proliferative Reduction Framework for Future Mobile MediaPhone, Proceedings of the 2nd International Symposium on Mobile Multimedia Systems & Applications (MMSA 2000), pp. 1 - 5, Delft
  • Virtual keys are projected on the palm by means of the display of a semi-permeable visor, wherein the virtual display is directed on the palm of the user, and a video camera is used for recognition of the movements of the other hand that make selections.
  • the pointer device that makes selections may also be equipped with a magnetic tracking device.
  • a multimodal input device is based on expressions, gestures, posture of the body, or speech, wherein devices held by or installed on the user are not utilized.
  • the user can be followed by means of cameras, and pointing by a hand can be detected for example by means of computer vision or infrared video tracking.
  • One known implementation is also a finger functioning as a mouse that is analysed by means of a video image (Takeshi Kurata, Takashi Okuma, Masakatsu Kourogi, & Katsuhiko Sakaue: The Hand-mouse: A Human Interface Suitable for Augmented Reality Environments Enabled by Visual Wearables, In Proc. ISMR 2001 in Yokohama, Japan, pp. 188-189).
  • US patent publication 6091546 discloses an apparatus in the form of glasses for a user interface that contains for example a display, a video camera, a microphone and other apparatuses.
  • US patent publication 6064354 discloses a method and apparatus by means of which a three-dimensional impression and user interface are produced by means of a screen. The pointer is for example a finger that is tracked by means of video cameras and a suitable video analysis and an algorithm.
  • in US patent publication 6204852, two different cameras and video analysis are used for determining the posture and location of the user's hand, and this is used as an input for the computer.
  • a computer apparatus that utilizes video analysis is also disclosed in US patent publication 5168531. Systems are used in connection with different kinds of computer apparatuses and displays, wherein an alternative pointing method is attained by means of the video analysis.
  • the user interface device can be easily utilized in operation with numerous different devices.
  • the invention relates to a system containing a small user interface device that moves with the user and that is intended to be hand-held and preferably also carried along, and that forms the user interface of another device, preferably in a wireless manner. It can function as a general remote controller for different kinds of electronic devices, or it can function as an assistive device in retrieving and viewing information, thus operating for example as a browser.
  • the device to be operated can be located close by, or the operation may occur from a distance.
  • the user interface device can also function as a built-in or integrated part of another device.
  • the pointer is a suitable apparatus, an arrangement or for example merely a finger, wherein the device preferably functions in such a manner that separate receivers, transmitters, sensors, light sources, reflectors or other identifiers are not necessary in the hand that operates as a pointer.
  • a virtual, intuitive user interface is produced by means of a small device.
  • a tracking system is integrated in the apparatus for the purpose of pointing and for the interpretation of the same, wherein the basic idea is to use for example a camera and an ocular display in such a manner that for example the act of pointing with a finger is intuitive with respect to the view of the display.
  • the finger is tracked for example by means of video analysis.
  • the device contains a wired or wireless connection to an external, typically separate device for which said device operates as a user interface device, the user interface of said device being presented on the ocular display of the device.
  • the device may also be integrated for example in a remote controller or a mobile phone.
  • the device to be controlled comprises means which are utilized to transfer the user interface into the user interface device, whose control system and data transmission means are arranged for this purpose.
  • the control system of the device to be controlled is arranged to communicate in a suitable manner with the user interface device and to control its function on the basis of the commands of the same.
  • the device to be controlled may also entirely lack physical devices, for example a keyboard or a display for direct control or use.
  • the device to be controlled may be for example an independent device without telecommunication connections, or a computer apparatus.
  • the user interface device comprises one or more ocular display devices.
  • the device may also comprise, either as a part of the tracking system or as a part separated from the same, one or several built-in video or still cameras.
  • the camera images an area that is approximately superimposed on the apparent area of the display. If the camera and the display are aligned with each other, it is not necessary to calibrate them with each other.
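When camera and display are not aligned, a simple calibration can map camera coordinates onto display coordinates so that the finger appears in the right place. A minimal sketch of such a per-axis linear map; the reference points and resolutions below are illustrative assumptions, not values from the patent:

```python
def make_calibration(cam_pts, disp_pts):
    """Fit a per-axis linear map display = a * cam + b from two
    reference correspondences (camera pixel -> display pixel)."""
    (cx0, cy0), (cx1, cy1) = cam_pts
    (dx0, dy0), (dx1, dy1) = disp_pts
    ax = (dx1 - dx0) / (cx1 - cx0)
    ay = (dy1 - dy0) / (cy1 - cy0)
    bx = dx0 - ax * cx0
    by = dy0 - ay * cy0
    return lambda x, y: (ax * x + bx, ay * y + by)

# Hypothetical setup: 640x480 camera mapped onto a 320x240 ocular display
to_display = make_calibration([(0, 0), (640, 480)], [(0, 0), (320, 240)])
```

Two correspondences suffice for a linear map; a real device would likely use more points and a least-squares fit to absorb lens distortion.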
  • when a video-based tracking system is used, pointing or selection by means of a finger of the user or another pointer is detected by means of image analysis.
  • a suitable light and possibly filters are utilized for tracking of the finger.
  • the pointing does not necessarily require an external device, such as data gloves or a pointing stick, wherein a finger replaces a conventional mouse.
  • the apparatus and its different variations reduce muscle fatigue in the pointing action and are intuitive and pleasant to use.
  • the device detects the movement of the hand only when desired or when it is within the visual range of a camera of the tracking system, and the eye is close to the ocular display, wherein unintentional gestures and pointing actions do not interfere with the actions of the user.
  • the device can also be constructed in such a manner that the camera or the ocular display can be turned downward, wherein it is not necessary to lift up the hand very high, and fatigue can be avoided.
  • the device can comprise several alternative displays or cameras.
  • the display can be seen in front of the user and the camera images the movements of the hand of the user diagonally downward, wherein the hand can be down or even on top of a table.
  • IR infrared
  • structured light significantly facilitates the detection of the hand, because when IR light is used, it is possible to filter most of the other light away, thus saving computing capacity.
  • an IR-LED apparatus and a camera synchronized with it can be used, wherein the background can be separated from the hand in the image even more easily.
  • when the camera takes every other image in IR light and every other image without it, the effect of the background can be eliminated in a simple manner.
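The alternating-illumination scheme above amounts to a simple frame difference: only surfaces lit by the device's own IR source change brightness between consecutive frames, so the static background cancels out. A sketch with synthetic frames; the array size and threshold are illustrative assumptions, not values from the patent:

```python
import numpy as np

def ir_difference(frame_ir_on, frame_ir_off, threshold=30):
    """Subtract the frame taken without IR illumination from the frame
    taken with it; pixels that brightened belong to nearby IR-lit surfaces
    (the hand), while the background stays below the threshold."""
    diff = frame_ir_on.astype(np.int16) - frame_ir_off.astype(np.int16)
    return diff > threshold  # boolean mask of the illuminated hand region

# Synthetic 4x4 frames: the "hand" pixels brighten by 100 under IR light
off = np.full((4, 4), 50, dtype=np.uint8)
on = off.copy()
on[1:3, 1:3] += 100
mask = ir_difference(on, off)
```

Casting to a signed type before subtracting avoids the unsigned-integer wraparound that would otherwise corrupt the difference image.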
  • the IR light can also be replaced with another kind of light; it is for example possible to use suitable light spots or patterns, the light in question being structured IR light, ultraviolet light, visible light, or weak laser light.
  • one possibility is to use ultrasound, wherein the pointer device listens to an ultrasound transmitter that can be located in the user interface device.
  • Fig. 1 shows a preferred embodiment of the device according to the invention
  • Fig. 2 illustrates the use of the device according to Fig. 1,
  • Fig. 3 shows a using action corresponding to Fig. 2
  • Fig. 4 shows the view of the display of the device when a semi- permeable display is used.
  • the tracking system is integrated in a user interface device.
  • the tracking system is based on video tracking.
  • one or several cameras 3 in the tracking system image the area that is approximately superimposed on the view of the display.
  • the tracking system is entirely integrated in the device.
  • the tracking system operates in a functional connection with the pointer device.
  • the tracking system is missing entirely, but the device contains the necessary interfaces for the couplings of such a system.
  • the tracking system can also include versions in which one or more transmitters are placed in the device for different pointer devices. Alternatively the camera is used for imaging of another area in which the hands are located.
  • the display is composed of optics 1 and a micro display 2.
  • the finger can function directly as a pointer similar to a mouse of a computer.
  • the finger of the user can be seen on top of the selection options, for example a menu (semi-permeable display), or the finger is represented by a cursor (non-transparent display) virtually in the right location.
  • when a semi-permeable display is used, a semi-reflective mirror is also required.
  • the aforementioned area can also be imaged by means of a camera that is not part of the tracking system and the image is shown on top of the user interface.
  • the view of the environment can be switched off and only the pointer can be displayed.
  • the tracking of the pointing takes place by means of an arrangement that is separate from this camera or that is integrated in the device, or the coordinate information is obtained from an external arrangement, for example from a pointer device that is intended for interpretation of the pointing and transmission of the information relating thereto to the user interface device.
  • the pointing is for example a location and/or position when for example a 3D mouse is used.
  • the coordinates are calculated or the pointing is interpreted entirely in the pointer device, and the coordinate information and/or other information is transferred to the user interface device that is to be controlled.
  • the device can still comprise a tracking system, but it primarily only contains different transmitters without an analysing apparatus, which is typically a processor system.
  • IR-LED devices 4 that transmit IR light, as well as suitable filters 5.
  • the camera can be an IR camera, although known less expensive CCD cameras also detect IR light quite well. A more expensive solution is the use of an actual thermal camera.
  • the device may contain several IR lights, or several display devices, or they can be turned into a suitable position. Two displays can also be stationary or turnable in such a manner that a separate image is presented for both eyes.
  • the user sees the desired user interface of an external device suitable for the situation, which can be loaded via a wired or wireless connection device 6, wherein for example Bluetooth, infrared (e.g. IrDA), GSM or RF (Radio Frequency) data transmission methods are used for the data transmission.
  • the device can function as a wireless user interface or as a remote controller for various different electronic devices.
  • the device can also function as a small, hand-held browser.
  • the act of tracking a hand or a finger from a video image is a computationally heavy process, and generally known video tracking algorithms can be used in the tracking.
  • the tracking and calibration can be facilitated in many different ways, wherein it can be for example assumed that the hand is located in the vicinity and approximately in a certain position.
  • the user interface device can utilize IR light that is invisible to the eye and a suitable camera can be used to illuminate and track the hand. Because the operating radius is relatively small, the illumination does not consume a great deal of energy and for example IR-LED devices consume quite a small amount of energy.
  • the access of visible light to the tracking cameras can be prevented entirely, without restricting the travel of IR light, wherein a suitable filter is used, for example Kodak ® Wratten IR nr. 87.
  • the brightest surfaces that reflect IR light are probably closest to the IR light and the camera. This filters the image automatically, because the background appears dimmer, and facilitates the image processing. Different surfaces reflect IR light in different ways; for example human skin reflects IR light well. In the method it is also possible to track other objects.
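The brightness-implies-proximity observation above suggests a very simple segmentation: threshold the IR image and take the centroid of the bright pixels as a rough fingertip position. A sketch with a synthetic frame; the threshold and frame size are illustrative assumptions, not values from the patent:

```python
import numpy as np

def fingertip_centroid(ir_frame, threshold=200):
    """With an IR-pass filter in front of the camera, the brightest pixels
    belong to the surfaces nearest the IR source; the centroid of those
    pixels gives a rough (x, y) position for the pointing finger."""
    ys, xs = np.nonzero(ir_frame >= threshold)
    if xs.size == 0:
        return None  # no sufficiently bright (i.e. nearby) object in view
    return (float(xs.mean()), float(ys.mean()))

# Synthetic 8x8 IR frame with one bright reflective patch (the finger)
frame = np.zeros((8, 8), dtype=np.uint8)
frame[2:4, 5:7] = 255
```

Returning None when nothing is bright enough gives the caller a cheap way to skip tracking when no hand is in range, in the spirit of the patent's sleep-mode and ultrasound-gating ideas.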
  • when the user interface is transmitted by other external devices, it is in the HTML format of a browser page or in another suitable format. Remote use of the devices is possible by utilizing the GSM system or another generally known method in the data transmission. It is also possible to apply a radio card attached to the device, wherein the device contains a standard expansion card interface.
  • the pointing accuracy of a finger is restricted because of its size.
  • it is also possible to have a separate pointer device available, for example a suitable stick or rod that has a suitable colour or is reflective, or that can also transmit light, ultrasound or another suitable signal.
  • the tracking system of the user interface device can transmit light, ultrasound, or other suitable signal, said pointer device receiving and at the same time interpreting the pointing conducted by means of the same.
  • Ultrasound is first transmitted and then received by means of at least three receivers, wherein it is possible to determine at least the location in the space.
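The three-receiver arrangement can be illustrated with planar trilateration: subtracting the circle equations of two receiver pairs yields a small linear system for the transmitter position. The receiver layout and distances below are a made-up example, not from the patent:

```python
def trilaterate_2d(receivers, distances):
    """Solve for a 2D transmitter position from three receiver positions
    and the distances measured via ultrasound time of flight. Subtracting
    the circle equations pairwise linearizes them into a 2x2 system."""
    (x1, y1), (x2, y2), (x3, y3) = receivers
    r1, r2, r3 = distances
    A = 2 * (x2 - x1); B = 2 * (y2 - y1)
    C = r1**2 - r2**2 - x1**2 + x2**2 - y1**2 + y2**2
    D = 2 * (x3 - x2); E = 2 * (y3 - y2)
    F = r2**2 - r3**2 - x2**2 + x3**2 - y2**2 + y3**2
    det = A * E - B * D  # zero when the receivers are collinear
    return ((C * E - B * F) / det, (A * F - C * D) / det)

# Receivers at three corners of the device; transmitter actually at (1, 1)
pos = trilaterate_2d([(0, 0), (4, 0), (0, 4)], (2 ** 0.5, 10 ** 0.5, 10 ** 0.5))
```

Recovering a full 3D location as the patent mentions needs the same idea with a fourth, non-coplanar receiver or a known height constraint.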
  • the analysis can be performed by the control system of the user interface device and the control system of its tracking system, which receives information from a pointer device and thus functions as an analysing apparatus.
  • the resolution of the camera or the focal distance of the objective can be constructed so that they are suitable and the size of the finger is not too large.
  • the virtual area of the ocular and the area imaged by the camera are not necessarily equally large.
  • the device can also be arranged to communicate with cameras, finger tracking devices or other corresponding devices in the surrounding space, and thus it is possible to further improve the estimation of the three-dimensional location and position of the finger.
  • it is also possible to use ultrasound as radar by transmitting the ultrasound and listening to its reflections to detect whether there is a hand nearby. Thus, vain attempts to search for a hand that is not in the visual range of the camera are avoided. It is also possible to utilize ultrasound for rough determination of the distance of two hands.
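The radar-style use of ultrasound reduces to a time-of-flight computation: the pulse travels to the hand and back, so the one-way distance is half the round-trip path. A sketch, assuming the speed of sound in room-temperature air:

```python
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def echo_distance(round_trip_seconds):
    """Distance to the reflecting object (e.g. a nearby hand): the pulse
    covers the distance twice, so halve the round-trip path length."""
    return SPEED_OF_SOUND * round_trip_seconds / 2.0
```

A 2 ms echo therefore corresponds to a hand about 34 cm away, comfortably inside the short operating radius the patent describes.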
  • the device is especially suitable for short-term actions, such as functioning as a remote controller, for operating a computer or browsing the Internet.
  • the shape of the device can resemble binoculars, a camera, a remote controller, a stick, or another corresponding device.
  • the device can cover the actual view or it can be partly transparent, wherein information is shown on top of the actual view.
  • a WIMP user interface that is controlled by means of a finger.
  • the finger of the user can replace a mouse and the cursor moves with the finger. It is usually not necessary to focus the user interface with the actual view, if the embodiment in question does not especially require such an action.
  • the selection of the pointing takes place for example by pressing a button, by pointing for a sufficiently long time, with a gesture, by saying the selection out loud, by observing the EEG waves of the brain, the EMG waves of muscles or the EOG waves of the eyes, or in another manner.
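Selection by "pointing for a sufficiently long time" is a dwell-time rule: fire once the pointer has rested on the same element long enough. A minimal sketch; the one-second dwell and the string element names are arbitrary illustrative choices, not from the patent:

```python
class DwellSelector:
    """Select the element under the pointer after it has stayed on the
    same element for `dwell` seconds."""

    def __init__(self, dwell=1.0):
        self.dwell = dwell
        self.element = None   # element currently pointed at
        self.since = None     # time at which pointing at it began

    def update(self, element, now):
        """Feed the element under the pointer at time `now` (seconds);
        returns the element once the dwell time elapses, else None."""
        if element != self.element:
            self.element, self.since = element, now  # pointer moved: restart
            return None
        if element is not None and now - self.since >= self.dwell:
            self.since = now  # re-arm so the selection fires once per dwell
            return element
        return None
```

The re-arming step avoids repeated selections while the finger keeps hovering, which matches the patent's concern that unintentional pointing should not interfere with the user's actions.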
  • the user interface device is especially well suited for WIMP user interfaces, but it can be expanded for other purposes, for example for different 3D user interfaces.
  • on the basis of the need it is possible to produce a group of different implementations of the user interface device. Instead of a finger it is possible to use another body part, a special pointer device or for example a hand-held object. It can also be supplemented with functions of other devices, such as a camera, a mobile station or a phone.
  • the user interface device can track the movements of the user's hand by means of a video camera, an IR camera or another suitable camera. There may be several cameras, wherein it is easy to calculate the distance. The smallest CCD and CMOS cameras weigh only a few grams. CCD linear sensors enable high resolution and fast image acquisition, and linear CCD cameras are also affordable; several of them are necessary for determining the location. Light sensing diodes and LED lights are also suitable for tracking. There are also different known optical lighthouses, in which light coded in a determined manner is transmitted from several lighthouses to the environment, and the receiving sensor transmits the information to computing means that calculate the location. By monitoring the location and its changes it is possible to interpret a pointing that has been made.
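With two cameras a fixed baseline apart, the distance calculation mentioned above follows from the standard stereo relation depth = focal length × baseline / disparity. The focal length, baseline and disparity below are illustrative numbers, not from the patent:

```python
def stereo_depth(focal_px, baseline_m, disparity_px):
    """Depth of a point seen by two parallel cameras: the same point
    appears shifted by `disparity_px` pixels between the two images,
    and the shift shrinks as the point moves farther away."""
    return focal_px * baseline_m / disparity_px

# e.g. 500 px focal length, 6 cm baseline, 50 px disparity
depth = stereo_depth(500.0, 0.06, 50.0)
```

For a hand-held device the baseline is only a few centimetres, which is sufficient because the hand operates within the short radius the patent assumes.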
  • the retina of the eye reflects IR light especially well.
  • the ocular can be supplemented with different eye tracking systems that are disclosed for example in US patent publication 6055110 or that are known from the selection of the focusing area in Canon® system cameras.
  • the element can be activated or selected. The selection can be based for example on the pressing of a button, or on the use of speech or sound.
  • the device can also contain a sensor detecting the touch of a hand or the vicinity of an eye that activates the device from so-called sleep mode, wherein energy is saved.
  • the user interface device may contain position sensors, magnetic sensors, gyroscopes, satellite tracking devices, ultrasound sensors, cameras, or corresponding devices by means of which it is possible to determine the position and location of the device.
  • Different objects can be manipulated for example by means of video analysis.
  • with 3D graphics it is possible to produce icons, menus or other objects that can be touched virtually, and that can be dragged, opened, turned or subjected to other kinds of actions.
  • a conventional WIMP menu can also have another side that becomes visible when it is turned around.
  • different kinds of games that react to the position of fingers, hands or other objects are possible.
  • the user interface device can contain speech recognition, recognition of the speaker, recognition of the user by the face, eye, or fingerprints, or other biometric recognition methods, security methods or built-in encryption methods.
  • the device can be separate from other devices and it can also be built- in or connected to different devices. It can also contain functions of other devices, and function as a camera, a clock, a recording device or a telephone.
  • in some methods of 3D modelling and of tracking a hand or another object, so-called structured light, for example suitable light spots, lines or lattices, is used.
  • it is also possible to transmit suitably structured IR light to the environment, wherein it can be more easily distinguished from other IR sources, which include natural light and lamps, and for example recognition of shapes becomes easier.
  • the light source can also be an affordable, low-power IR laser whose light can be scattered by means of commercial components, for example into light spots of a desired shape.
  • one possibility is to use laser distance meters.
  • the system only receives reflected light modulated by the laser source and rejects the background light, which facilitates the tracking.
  • such devices are commercially available.
  • a pointer device that forms a part of the tracking system and that is for example a separate pointer device coupled as an auxiliary device
  • it can be for example a 3D mouse according to related art. Such devices usually require transmitters, receivers and other auxiliary devices located in the environment.
  • a known mouse is for example the Logitech 3D mouse that is suitable for processing even 6 degrees of freedom and that comprises a receiver that receives signals of an ultrasound transmitter for tracking. The transmitter or receiver can be separate from the user interface device.
  • the user interface device processes the obtained pointing and/or tracking information sufficiently so that the pointing would be intuitive.
  • the pointer device is for example an autonomous tracking system that interprets its environment in such a manner that it can track itself and function as a pointer for different devices.
  • the tracking system possibly utilizes for example passive reflectors or the like, or it can utilize a signal actively transmitted and/or received by itself and/or the environment.
  • Different kinds of tracking methods are generally known for example in virtual reality.
  • a known process is also tracking based on an electromagnetic method and one method is the use of small acceleration transducers or inertia sensors and/or gyroscopes that detect movements.
  • there are also electromechanical methods that are coupled between the pointer device and the user interface device, and whose position and movements can be utilized to interpret the pointing.
  • as an example it is possible to mention a cord-based method.
  • the pointer device communicates by means of a wired or wireless data transmission connection at least with the user interface device for whose pointing it is intended.
  • the data transmission takes place in a suitable format.
  • the pointer device can be a hand-held device, or it can be attached for example to a finger or to a hand. It can be connected or integrated for example to a glove, a watch, a ring, a mobile phone or a PDA device (Personal Digital Assistant).
  • One pointer is especially a suitable wrist- band or a watch that is usually always carried along.
  • Different tracking systems can be composed of several parts of which at least some can be connected as an auxiliary device to the user interface device, for example a transmitter or a receiver, and the pointing part intended for pointing is a separate part that communicates with the other parts.
  • the user interface device contains a suitable data transmission bus, wherein it can be equipped with the desired tracking systems or pointer devices.
  • the device suitably comprises the necessary connectors for a conventional mouse or the like and for the purpose of conducting 2D pointing.
  • Alternative devices for 2D pointing and simple tracking can also be integrated in the device itself, wherein it is possible to use for example keys, a touch sensitive surface, a joystick type mechanism or tracking keys or control balls.

Abstract

System for establishing a user interface that comprises at least a user interface device that is portable and small in size, wherein it is suitable to be held on the palm, and that comprises at least one ocular display (1, 2) that is intended to be viewed when the user interface device is positioned in front of the eye of the user. The user interface device is intended for presenting the user interface and pointing, said pointing being intended for controlling the user interface and arranged to be performed apart from the device, in such a manner that the pointing is intuitive with respect to the view of the ocular display (1, 2), and at least a first data transmission bus (6) that is intended for communication between the device and another apparatus, for which said user interface device is arranged to function as a user interface.

Description

SYSTEM FOR ESTABLISHING A USER INTERFACE
The present invention relates to a system according to the preamble of the appended claim 1 for establishing a user interface.
The development of electronics and other technology has constantly reduced the size and weight of many kinds of devices. At the same time, many devices are being digitalized and are becoming more versatile. The usability and user interfaces of devices are therefore becoming more and more important, which conflicts with the reduction in size. Many electronic devices are changing from electrically restricted devices into user-interface-restricted devices. If they become even smaller in size, their use will become more complicated, and better user interfaces will be necessary.
While the devices become smaller, the displays should become larger. Different kinds of virtual displays have been suggested as a solution; these are placed in front of the eyes or on the head, or held in the hand, wherein a picture larger than the display itself is seen by means of optics. The user interfaces for making selections and for other functions have been implemented for example by means of push buttons in separate devices.
A good user interface is as intuitive as possible, it blends into normal life, and it is not always necessary to separately learn to use it. The most well-known user interface is the keyboard and mouse of a computer. Thus the user's eyes are on the display, on which a cursor moved by the mouse moves on a two-dimensional surface.
Computers typically contain a graphical user interface (GUI) that is based on a WIMP system (Windows-Icon-Menu-Pointer). Other electric devices also contain a user interface that is typically attached to the device or is a separate remote controller.
In recent years, research on for example virtual reality and multimodal user interfaces has presented many human-oriented and sense-oriented ways of communicating with a computer. For example several senses and other hints are utilized therein. The methods include for example speech and eye recognition, recognition of a hand, expression or gesture based on computer vision, and touch feedback. The devices to be used include for example a mouse, a display, a touch screen, a pen, a light pen, a data glove, a pedal, a virtual visor, a force feedback device, or another multimodal device.
In the following some known user interfaces and the devices used therein will be presented.
One known wireless user interface device is the Xerox® PARCTAB, a palmtop computer equipped with a touch screen that communicates wirelessly with the applications of a work station by means of an infrared link. The purpose of the device is to function as a user interface device. In the so-called Ubiquitous Computing principle the wireless connection to the devices in the environment is of great importance. There is also a known inViso® eCase PDA device that is equipped with a micro display in which information is presented in a manner similar to a normal-sized display. The device has a touch sensitive surface that functions as a mouse, as well as control buttons, and by means of auxiliary devices it can be used for wireless data retrieval and communication, for example via e-mail. There are also other universal information appliances (Eustice, K., Lehman, T., Morales, A., Munson, M., Edlund, S., Guillen, M.: A universal information appliance. IBM Systems Journal, Vol. 38, No. 4, 1999, pp. 575-601).
In augmented reality (AR) and virtual reality (VR), virtual visors are used; these are display devices worn on the head, by means of which an image of the surrounding world is created in virtual reality. The virtual visor and the data glove are tracked with special tracking devices that function electromechanically, electromagnetically, acoustically or optically. Augmented reality utilizes semi-permeable displays, wherein the real view is complemented with artificial objects that are aligned on top of it.
There is also a known AR system (Feiner, S., MacIntyre, B., Haupt, M., & Solomon, E.: Windows on the world: 2D windows for 3D augmented reality, Proc. UIST '93 (ACM Symp. on User Interface Software and Technology), Atlanta, GA, November 3rd to 5th, 1993, pp. 145-155), in which 2D windows are added on top of the real view. The windows may be locked to the coordinates of the world, move along with the visor, or be locked to actual objects.
The so-called wearable computing principle is related to augmented reality; therein the display and the computer move along with the user and can be worn. Such devices have been developed for example in the MIThril project (MIT Media Lab, USA) and in the HIT Lab (Human Interface Technology Lab, University of Washington, USA). The system typically utilizes a wirelessly operating apparatus, a small display attached to glasses or a covering display (HMD, Head Mounted Display), as well as a separate hand-held control device that resembles a palmtop microcomputer or is a keyboard equipped with a cable. It is also possible to use head mounted cameras and new kinds of I/O (Input/Output) devices, such as video analysis in which a finger is used as a pointer device.
There is also a known markup language called MARISIL (Pulli, P., Antoniac, P.: MARISIL - User Interface Framework for Future Mobile MediaPhone, Proceedings of the 2nd International Symposium on Mobile Multimedia Systems & Applications (MMSA 2000), pp. 1-5, Delft) that functions in an AR user interface. Virtual keys are projected on the palm by means of the display of a semi-permeable visor, wherein the virtual display is directed onto the palm of the user, and a video camera is used for recognition of the movements of the other hand that makes selections. Thus, it is possible to select phone numbers or give commands without any hand-held devices. The pointer device that makes selections may also be equipped with a magnetic tracking device.
There are also many known methods in which a multimodal input device is based on expressions, gestures, posture of the body, or speech, wherein devices held by or installed on the user are not utilized. The user can be followed by means of cameras, and pointing by a hand can be detected for example by means of computer vision or video tracking by infrared light. One known implementation is a finger functioning as a mouse, analysed by means of a video image (Takeshi Kurata, Takashi Okuma, Masakatsu Kourogi, & Katsuhiko Sakaue: The Hand-mouse: A Human Interface Suitable for Augmented Reality Environments Enabled by Visual Wearables, In Proc. ISMR 2001 in Yokohama, Japan, pp. 188-189).
US patent publication 6091546 discloses an apparatus in the form of glasses for a user interface that contains for example a display, a video camera, a microphone and other apparatuses. US patent publication 6064354 discloses a method and apparatus by means of which a three-dimensional impression and user interface are produced by means of a screen. The pointer is for example a finger that is tracked by means of video cameras and a suitable video analysis and an algorithm. In US patent publication 6204852 two different cameras and a video analysis are used for determining the posture and location of the user's hand, and this is used as an input for the computer. A computer apparatus that utilizes video analysis is also disclosed in US patent publication 5168531. Systems are used in connection with different kinds of computer apparatuses and displays, wherein an alternative pointing method is attained by means of the video analysis.
It is a purpose of the present invention to present a new type of user interface device and system, wherein the restrictions occurring in the related art can be reduced. The user interface device can be easily utilized in operation with numerous different devices.
The invention relates to a system containing a small user interface device that moves with the user, that is intended to be hand-held and preferably also carried along, and that forms the user interface of another device, preferably in a wireless manner. It can function as a general remote controller for different kinds of electronic devices, or it can function as an assistive device in retrieving and viewing information, thus operating for example as a browser. The device to be operated can be located close by, or the operation may occur from a distance. The user interface device can also function as a built-in or integrated part of another device. The pointer is a suitable apparatus, arrangement or for example merely a finger, wherein the device preferably functions in such a manner that separate receivers, transmitters, sensors, light sources, reflectors or other identifiers are not necessary in the hand that operates as a pointer.
The system according to the invention for a user interface is presented in claim 1.
In the invention a virtual, intuitive user interface is produced by means of a small device. In an embodiment a tracking system is integrated in the apparatus for the purpose of pointing and for the interpretation of the same, wherein the basic idea is to use for example a camera and an ocular display in such a manner that for example the act of pointing with a finger is intuitive with respect to the view of the display. The finger is tracked for example by means of video analysis. The device contains a wired or wireless connection to an external, typically separate device for which said device operates as a user interface device, the user interface of said device being presented on the ocular display of the device. The device may also be integrated for example in a remote controller or a mobile phone.
The device to be controlled comprises means which are utilized to transfer the user interface into the user interface device, whose control system and data transmission means are arranged for this purpose. The control system of the device to be controlled is arranged to communicate in a suitable manner with the user interface device and to control its function on the basis of the commands of the same. The device to be controlled may also entirely lack physical devices, for example a keyboard or a display for direct control or use. The device to be controlled may be for example an independent device without telecommunication connections, or a computer apparatus.
The user interface device comprises one or more ocular display devices. The device may also comprise, either as a part of the tracking system or as a part separate from the same, one or several built-in video or still cameras. The camera images an area that is approximately superimposed on the apparent area of the display. If the camera and the display are in the same position with respect to each other, it is not necessary to calibrate them with each other. When a video-based tracking system is used, pointing or selection by means of a finger of the user or another pointer is detected by means of image analysis. A suitable light and possibly filters are utilized for tracking of the finger. The pointing does not necessarily require an external device, such as data gloves or a pointing stick, wherein a finger replaces a conventional mouse. The apparatus and its different variations reduce muscle fatigue in the pointing action, and it is intuitive and pleasant to use. Preferably the device detects the movement of the hand only when desired or when it is within the visual range of a camera of a tracking system and the eye is close to the ocular display, wherein unintentional gestures and pointing actions do not interfere with the actions of the user.
The device can also be constructed in such a manner that the camera or the ocular display can be turned downward, wherein it is not necessary to lift up the hand very high, and fatigue can be avoided. Alternatively, the device can comprise several alternative displays or cameras. For example the display can be seen in front of the user and the camera images the movements of the hand of the user diagonally downward, wherein the hand can be down or even on top of a table.
Alternatively it is possible to replace a conventional camera with a built-in infrared (IR) apparatus that comprises an IR transmitter, an IR filter and an IR camera. The use of infrared light or structured light significantly facilitates the detection of the hand, because when IR light is used, it is possible to filter most of the other light away, thus saving computing capacity. If necessary, it is possible to illuminate the hand by means of visible light as well, which does not disturb the user, but others will notice the light.
It is also possible to use a suitably flashing IR-LED apparatus and a camera synchronized with the same, wherein in the image the background can be separated from the hand even more easily. When the camera takes every other image in the IR light and every other image without it, the effect of the background can be eliminated in a simple manner. The IR light can also be replaced with another kind of light, wherein it is for example possible to use suitable light spots or patterns, the light in question being structured IR light, ultraviolet light, visible light, or weak laser light. One possibility is to use ultrasound instead, wherein the pointer device listens to an ultrasound transmitter that can be located in the user interface device.
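The alternating illumination scheme amounts to subtracting an unlit frame from the immediately following IR-lit frame, so that the static, ambiently lit background cancels out. The following sketch illustrates the principle; the frame size, intensity values and threshold are hypothetical illustration values, not taken from the patent.

```python
import numpy as np

def isolate_ir_lit_regions(frame_ir_on, frame_ir_off, threshold=30):
    """Subtract a frame taken without IR illumination from one taken
    with the IR-LED flash on; pixels lit mainly by ambient light
    cancel out, leaving the regions brightened by the flash
    (typically nearby skin, which reflects IR well)."""
    diff = frame_ir_on.astype(np.int16) - frame_ir_off.astype(np.int16)
    return diff > threshold  # boolean mask of IR-brightened pixels

# Hypothetical 4x4 grayscale frames: a hand occupies the top-left
# 2x2 block and reflects the flash; the background stays unchanged.
ir_off = np.full((4, 4), 40, dtype=np.uint8)
ir_on = ir_off.copy()
ir_on[:2, :2] += 100  # IR reflection from the hand
mask = isolate_ir_lit_regions(ir_on, ir_off)
print(int(mask.sum()))  # 4
```

The widening to int16 before subtracting avoids unsigned-integer wrap-around in the difference.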
In the following description, the invention will be described in more detail with reference to the appended drawings, in which
Fig. 1 shows a preferred embodiment of the device according to the invention,
Fig. 2 illustrates the use of the device according to Fig. 1,
Fig. 3 shows a using action corresponding to Fig. 2, and
Fig. 4 shows the view of the display of the device when a semi- permeable display is used.
It is a natural way for people to show and select things by pointing them with a finger. The coordination of the human hand and eye is accurate and intuitive in the pointing.
In the following, a preferred embodiment will be discussed as an example, in which the tracking system is integrated in a user interface device. In this example, the tracking system is based on video tracking. When video tracking is used, one or several cameras 3 in the tracking system image the area that is approximately superimposed on the view of the display. In the presented preferred embodiment the tracking system is entirely integrated in the device. In other embodiments the tracking system operates in a functional connection with the pointer device. In one embodiment the tracking system is missing entirely, but the device contains the necessary interfaces for the couplings of such a system. The tracking system can also include versions in which one or more transmitters are placed in the device for different pointer devices. Alternatively the camera is used for imaging of another area in which the hands are located. The display is composed of optics 1 and a micro display 2. When the motion of the hand or fingers is tracked by processing the video image, the finger can function directly as a pointer similar to a mouse of a computer. Thus, the finger of the user can be seen on top of the selection options, for example a menu (semi-permeable display), or the finger is represented by a cursor (non-transparent display) virtually in the right location. When a semi-permeable display is used, a semi-reflective mirror is also required.
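In its simplest form, the video-based pointing described above reduces to finding the brightest blob in the camera frame and using its centroid as the cursor position. The sketch below assumes that suitable IR illumination and filtering make the fingertip the only region above the threshold; the frame size, threshold and coordinates are purely illustrative.

```python
import numpy as np

def finger_cursor(frame, threshold=200):
    """Return the centroid (x, y) in pixel coordinates of all pixels
    at or above the threshold, or None when nothing bright enough is
    in view (i.e. no finger to track)."""
    ys, xs = np.nonzero(frame >= threshold)
    if xs.size == 0:
        return None
    return float(xs.mean()), float(ys.mean())

# A bright 4x4 spot standing in for an IR-lit fingertip.
frame = np.zeros((120, 160), dtype=np.uint8)
frame[50:54, 80:84] = 255
print(finger_cursor(frame))  # (81.5, 51.5)
```

The returned pixel coordinates would then be mapped to the coordinates of the ocular display, which is straightforward when the camera images approximately the same area as the apparent view of the display.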
The aforementioned area can also be imaged by means of a camera that is not part of the tracking system, and the image is shown on top of the user interface. When necessary, the view of the environment can be switched off and only the pointer can be displayed. The tracking of the pointing takes place by means of an arrangement that is separate from this camera or that is integrated in the device, or the coordinate information is obtained from an external arrangement, for example from a pointer device that is intended for interpretation of the pointing and transmission of the related information to the user interface device. The pointing is for example a location and/or position when for example a 3D mouse is used. If necessary, the coordinates are calculated or the pointing is interpreted entirely in the pointer device, and the coordinate information and/or other information is transferred to the user interface device that is to be controlled. The device can still comprise a tracking system, but it primarily only contains different transmitters without an analysing apparatus, which is typically a processor system.
To facilitate and accelerate the tracking it is possible to use IR-LED devices 4 that transmit IR light, as well as suitable filters 5. The camera can be an IR camera, although known less expensive CCD cameras also detect IR light quite well. A more expensive solution is the use of an actual thermal camera.
According to Fig. 1, the device may contain several IR lights or several display devices, or they can be turned into a suitable position. Two displays can also be stationary or turnable in such a manner that a separate image is presented for each eye. The user sees the desired user interface of an external device suitable for the situation, which can be loaded via a wired or wireless connection device 6, wherein for example Bluetooth, infrared (e.g. IrDA), GSM or RF (Radio Frequency) data transmission methods are used for the data transmission. The device can function as a wireless user interface or as a remote controller for various different electronic devices. The device can also function as a small, hand-held browser.
The act of tracking a hand or a finger from a video image is a computationally heavy process, and generally known video tracking algorithms can be used in the tracking. The tracking and calibration can be facilitated in many ways, for example by assuming that the hand is located in the vicinity and approximately in a certain position.
In an alternative embodiment the user interface device can utilize IR light that is invisible to the eye, and a suitable camera can be used to illuminate and track the hand. Because the operating radius is relatively small, the illumination does not consume a great deal of energy, and for example IR-LED devices consume quite a small amount of energy. The access of visible light to the tracking cameras can be prevented entirely, without restricting the travel of IR light, wherein a suitable filter is used, for example Kodak® Wratten IR nr. 87. Thus, the brightest surfaces that reflect IR light are probably closest to the IR light and the camera. This filters the image automatically, because the background appears darker, and facilitates the image processing. Different surfaces reflect IR light in different ways; for example human skin reflects IR light well. In the method it is also possible to track other objects.
When the user interface is transmitted by another external device, it is in the HTML format of a browser page or in another suitable format. Remote use of the devices is possible by utilizing the GSM system or another generally known method in the data transmission. It is also possible to apply a radio card attached to the device, wherein the device contains a standard expansion card interface. The pointing accuracy of a finger is restricted because of its size. For more accurate pointing there is a separate pointer device available, for example a suitable stick or rod that has a suitable colour or is reflective, or it can also transmit light, ultrasound or another suitable signal. In a corresponding manner, the tracking system of the user interface device can transmit light, ultrasound or another suitable signal, said pointer device receiving it and at the same time interpreting the pointing conducted by means of the same. Ultrasound is first transmitted and then received by means of at least three receivers, wherein it is possible to determine at least the location in the space. In the interpretation it is possible to utilize, to a varying extent, the control system of the user interface device and the control system of its tracking system, which receives information from a pointer device and thus functions as an analysing apparatus. The resolution of the camera or the focal distance of the objective can be constructed so that they are suitable and the size of the finger is not too large. Thus the virtual area of the ocular and the area imaged by the camera are not necessarily equally large.
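The determination of location from ultrasound received by at least three receivers corresponds to the standard trilateration calculation. The sketch below works in a plane with three non-collinear receivers; in three dimensions, three receivers fix the position only up to a mirror ambiguity across their plane, so a fourth receiver or a side assumption would be needed. The receiver positions and the target point are hypothetical.

```python
import numpy as np

def trilaterate(receivers, distances):
    """Linearize the circle equations |x - p_i| = d_i against the
    first receiver and solve the resulting linear system; least
    squares, so extra receivers simply refine the estimate."""
    p0, d0 = receivers[0], distances[0]
    A, b = [], []
    for p, d in zip(receivers[1:], distances[1:]):
        A.append(2 * (p - p0))
        b.append(d0**2 - d**2 + p @ p - p0 @ p0)
    sol, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return sol

# Three receivers on the device, transmitter (the pointer) at (0.3, 0.4).
rx = [np.array([0.0, 0.0]), np.array([1.0, 0.0]), np.array([0.0, 1.0])]
target = np.array([0.3, 0.4])
d = [float(np.linalg.norm(target - p)) for p in rx]
print(trilaterate(rx, d))  # approximately [0.3 0.4]
```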
On the basis of the size of the finger it is possible to roughly estimate the distance of the finger in the image of the camera. A better estimate can be attained by using several cameras. The device can also be arranged to communicate with cameras, finger tracking devices or other corresponding devices in the surrounding space, and thus it is possible to further improve the estimation of the three-dimensional location and position of the finger.
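The rough distance estimate from the apparent size of the finger follows the pinhole camera relation d ≈ f·W/w. The numbers below (finger width, focal length in pixel units) are purely illustrative assumptions.

```python
def finger_distance(real_width_mm, pixel_width, focal_length_px):
    """Pinhole-camera estimate: an object of physical width W that
    appears w pixels wide in a camera with focal length f (expressed
    in pixel units) lies at approximately f * W / w from the camera."""
    return focal_length_px * real_width_mm / pixel_width

# A roughly 16 mm wide fingertip imaged 40 px wide by a camera with
# an assumed focal length of 500 px would be about 200 mm away.
print(finger_distance(16, 40, 500))  # 200.0
```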
It is possible to use ultrasound as radar by transmitting the ultrasound and listening to its reflections to detect whether there is a hand nearby. Thus, vain attempts to search for a hand that is not in the visual range of the camera are avoided. It is also possible to utilize ultrasound for a rough determination of the distance of the two hands.
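Using ultrasound as radar reduces to timing the echo: the pulse travels to the hand and back, so the range is half the round-trip time multiplied by the speed of sound. A minimal sketch with an illustrative echo delay:

```python
SPEED_OF_SOUND = 343.0  # m/s in air at about 20 degrees C

def echo_distance(round_trip_s):
    """Range of the reflecting object from the echo delay; the pulse
    travels out and back, hence the division by two."""
    return SPEED_OF_SOUND * round_trip_s / 2

# An echo returning after 2 ms would put the hand about 0.343 m away.
print(echo_distance(0.002))
```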
Only one person can see the display, and thus outsiders cannot take a peek over the shoulder of the person in question. On the other hand, the display is not constantly attached to the head, covering the actual view, because the display is looked at only when necessary. The convenience of use is thus better, for example when moving about town or in short tasks. The device is especially suitable for short-term actions, such as functioning as a remote controller, operating a computer or browsing the Internet. The shape of the device can resemble binoculars, a camera, a remote controller, a stick, or another corresponding device.
The device can cover the actual view or it can be partly transparent, wherein information is shown on top of the actual view. Thus, it is possible to use for example a WIMP user interface that is controlled by means of a finger. The finger of the user can replace a mouse, and the cursor moves with the finger. It is usually not necessary to align the user interface with the actual view, if the embodiment in question does not especially require such an action. The selection of the pointing takes place for example by pressing a button, by pointing for a sufficiently long time, with a gesture, by saying the selection out loud, by observing the EEG waves of the brain, the EMG waves of muscles or the EOG waves of the eyes, or in another manner.
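Selection "by pointing for a sufficiently long time" is a dwell-time click: the pointer must remain within a small radius of an element for a given number of consecutive position samples. A sketch with hypothetical radius and frame-count parameters:

```python
def dwell_select(samples, target, radius=10.0, dwell_frames=30):
    """Return True once the tracked pointer has stayed within
    `radius` pixels of `target` for `dwell_frames` consecutive
    position samples; any excursion resets the count."""
    streak = 0
    for x, y in samples:
        if (x - target[0]) ** 2 + (y - target[1]) ** 2 <= radius ** 2:
            streak += 1
            if streak >= dwell_frames:
                return True
        else:
            streak = 0
    return False

# At 30 samples per second, holding the finger on a menu item for
# one second would trigger the selection.
print(dwell_select([(100.0, 100.0)] * 30, (102.0, 99.0)))  # True
```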
The user interface device is especially well suited for WIMP user interfaces, but it can be expanded for other purposes, for example for different 3D user interfaces. Thus, it is possible to use 3D graphics, to point at and manipulate objects in the depth direction instead of on a plane, or to produce a stereo view containing depth by adding a second display.
On the basis of the need it is possible to produce a group of different implementations of the user interface device. Instead of a finger it is possible to use another body part, a special pointer device or for example a hand-held object. The device can also be supplemented with functions of other devices, such as a camera, a mobile station or a phone.
The user interface device can track the movements of the user's hand by means of a video camera, an IR camera or another suitable camera. There may be several cameras, wherein it is easy to calculate the distance. The smallest CCD and CMOS cameras weigh only a few grams. CCD linear sensors enable high resolution and fast image acquisition. Linear CCD cameras are also affordable. Several linear CCD cameras are necessary for determining the location. Light sensing diodes and LED lights are also suitable for tracking. There are also different known optical lighthouses, in which light coded in a determined manner is transmitted from several lighthouses to the environment, and the receiving sensor transmits the information to computing means that calculate the location. By monitoring the location and its changes it is possible to interpret a pointing that has been made.
Alternatively it is possible to use other methods in the tracking, for example monitoring of the direction of the eye. The retina of the eye reflects IR light especially well. The ocular can be supplemented with different eye tracking systems that are disclosed for example in US patent publication 6055110, or that are known from the selection of the focusing area in Canon® system cameras. When the eye is directed to an element for a longer period of time, the element can be activated or selected. The selection can also be based for example on the pressing of a button, or on the use of speech or sound.
The device can also contain a sensor detecting the touch of a hand or the vicinity of an eye that activates the device from so-called sleep mode, wherein energy is saved.
The user interface device may contain position sensors, magnetic sensors, gyroscopes, satellite tracking devices, ultrasound sensors, cameras, or corresponding devices by means of which it is possible to determine the position and location of the device. Thus it is possible to produce user interface views dependent on the position or location, for example a desk that is always around, a personal data space or a 3D desk, wherein it is possible to easily see the alternatives or for example icons by looking around. Thus, it is possible to utilize the human spatial memory.
Different objects can be manipulated for example by means of video analysis. By means of 3D graphics it is possible to produce icons, menus or other objects that can be touched virtually, and that can be dragged, opened, turned or subjected to other kinds of actions. Thus, for example a conventional WIMP menu can also have another side that becomes visible when it is turned around. Different kinds of games that react to the position of fingers, hands or other objects are also possible.
The user interface device can contain speech recognition, recognition of the speaker, recognition of the user by the face, eye, or fingerprints, or other biometric recognition methods, security methods or built-in encryption methods.
When the orientation of the display is tracked, it is possible to produce a three-dimensional data environment or desk that surrounds the user. Stereoscopic viewing is also possible by using a separate display for both eyes. Thus it is also possible to stereoscopically show elements that have depth.
The device can be separate from other devices and it can also be built- in or connected to different devices. It can also contain functions of other devices, and function as a camera, a clock, a recording device or a telephone.
In the following, a method for tracking a hand or another object will be described. In some methods of 3D modelling, so-called structured light, for example suitable light spots, lines or lattices, is used. Instead of conventional IR light it is also possible to transmit suitably structured IR light to the environment, wherein it can be more easily distinguished from other IR sources, which include natural light and lamps, and for example recognition of shapes is easier. The light source can also be an affordable, low-power IR laser whose light can be scattered by means of commercial components, for example into light spots of a desired shape.
One possibility is to use laser distance meters. Thus the system only receives reflected light modulated by the laser source and rejects the background light, which facilitates the tracking. Such devices are commercially available. By means of some laser distance meters it is possible to transmit a light pulse with a very short duration, wherein it is only possible to process light reflected from objects within a fixed distance. This may further facilitate the tracking.
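The pulse-gating idea can be expressed as a delay window: a reflection from distance d returns after 2d/c, so rejecting echoes outside a fixed delay interval limits processing to objects within a chosen distance band. The range bounds below are hypothetical.

```python
C = 3.0e8  # speed of light in m/s (rounded)

def in_range_gate(delay_s, min_m=0.2, max_m=1.0):
    """True if a reflection arriving after delay_s corresponds to an
    object between min_m and max_m from the device; a short pulse
    reflected from distance d returns after 2*d/C."""
    d = C * delay_s / 2
    return min_m <= d <= max_m

print(in_range_gate(2 * 0.5 / C))  # True: hand at about 0.5 m
print(in_range_gate(2 * 5.0 / C))  # False: background at 5 m
```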
In the case of a pointer device that forms a part of the tracking system and that is for example a separate pointer device coupled as an auxiliary device, it can be for example a 3D mouse according to related art. Such mice usually require transmitters, receivers and other auxiliary devices located in the environment. A known mouse is for example the Logitech 3D mouse that is suitable for processing even 6 degrees of freedom and that comprises a receiver that receives signals of an ultrasound transmitter for tracking. The transmitter or receiver can be separate from the user interface device. On the basis of the received information the user interface device processes the obtained pointing and/or tracking information sufficiently so that the pointing would be intuitive.
The pointer device is for example an autonomous tracking system that interprets its environment in such a manner that it can track itself and function as a pointer for different devices. The tracking system possibly utilizes for example passive reflectors or the like, or it can utilize a signal actively transmitted and/or received by itself and/or the environment. Different kinds of tracking methods are generally known for example in virtual reality. A known process is also tracking based on an electromagnetic method, and one method is the use of small acceleration transducers or inertia sensors and/or gyroscopes that detect movements. There are also electromechanical methods that are coupled between the pointer device and the user interface device, and the position and movements of the same can be utilized to interpret the pointing. As an example it is possible to mention a cord-based method. The pointer device communicates by means of a wired or wireless data transmission connection at least with the user interface device for whose pointing it is intended. The data transmission takes place in a suitable format. The pointer device can be a hand-held device, or it can be attached for example to a finger or to a hand. It can be connected or integrated for example to a glove, a watch, a ring, a mobile phone or a PDA device (Personal Digital Assistant). An especially suitable pointer is a wristband or a watch that is usually always carried along. Different tracking systems can be composed of several parts, of which at least some, for example a transmitter or a receiver, can be connected as an auxiliary device to the user interface device, and the part intended for pointing is a separate part that communicates with the other parts. For this purpose the user interface device contains a suitable data transmission bus, wherein it can be equipped with the desired tracking systems or pointer devices.
In its basic form the device suitably comprises the necessary connectors for a conventional mouse or the like and for conducting 2D pointing. Alternative devices for 2D pointing and simple tracking can also be integrated in the device itself, for example keys, a touch-sensitive surface, a joystick-type mechanism, tracking keys or control balls.
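As a generic illustration of such integrated 2D pointing (not taken from the disclosure; the function and parameter names are hypothetical), relative step events from keys or a joystick can simply be accumulated into a cursor position clamped to the display bounds:

```python
def move_cursor(pos, delta, width, height):
    """Apply a relative (dx, dy) step from keys or a joystick-type
    mechanism to a cursor position on a width x height display,
    clamping so the cursor never leaves the visible area."""
    x = min(max(pos[0] + delta[0], 0), width - 1)
    y = min(max(pos[1] + delta[1], 0), height - 1)
    return (x, y)
```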
By combining the methods, modes of operation and apparatus structures presented above in connection with the different embodiments of the invention, it is possible to provide various advantageous embodiments of the user interface device.
The invention is not restricted to the example presented above in the drawings, and the dimensions of the structures and components presented therein do not necessarily comply with reality, but the purpose of them is to illustrate the invention. The invention can be applied within the scope of the appended claims.

Claims
1. A system for establishing a user interface that comprises at least a user interface device that is portable and small in size, wherein it is suitable to be held on the palm, and that comprises at least one ocular display (1, 2) that is intended to be viewed when the user interface device is positioned in front of the eye of the user, characterized in that the user interface device is intended for presenting the user interface and pointing, said pointing being intended for controlling the user interface and arranged to be performed apart from the device, in such a manner that the pointing is intuitive with respect to the view of the ocular display (1, 2), and at least a first data transmission bus (6) that is intended for communication between the device and another apparatus, for which said user interface device is arranged to function as a user interface.
2. The system according to claim 1, characterized in that the user interface device comprises at least one camera that is intended to image the environment in the desired direction, and an ocular display (1, 2) is arranged to present the user interface and the view of the environment to said desired direction on top of each other.
3. The system according to claim 1, characterized in that the ocular display (1, 2) is semi-permeable or arranged to be semi-permeable in such a manner that the user interface and the view of the environment can be presented on top of each other on the ocular display (1, 2).
4. The system according to any of the claims 1 to 3, characterized in that it comprises at least one tracking system (3, 4, 5) that is intended for interpretation of pointing, said pointing being intended to be performed for example by means of the movement of a finger and/or a separate pointer device belonging to the tracking system (3, 4, 5).
5. The system according to claim 4, characterized in that the tracking system (3, 4, 5) is at least partly arranged in the device.
6. The system according to claim 4 or 5, characterized in that the user interface device comprises at least a second data transmission bus that is intended either for the communication between the tracking system and the user interface device, or for the communication between the integrated tracking system and the pointer device and for transmission of information relating to the pointing to the device.
7. The system according to claim 4 or 5, characterized in that the tracking system (3, 4, 5) comprises one or more cameras and an analysing apparatus that is intended for the interpretation of pointing.
8. The system according to claim 4 or 5, characterized in that the tracking system is arranged especially for the interpretation of pointing performed by means of a finger of the user.
9. The system according to any of the claims 1 to 8, characterized in that the ocular display (1, 2) is arranged to show the user interface and the view of the environment to the desired direction on top of each other, said desired direction preferably being the direction in which the pointing is intended to be made.
10. The system according to any of the claims 1 to 9, characterized in that the user interface device comprises means for acknowledging the selection made by pointing.
11. The system according to any of the claims 1 to 10, characterized in that the ocular display (1, 2) is arranged to show a pointer moving in a manner corresponding to the pointing.
12. The system according to any of the claims 1 to 11, characterized in that the user interface device comprises means for interpreting the location or position of the device, wherein the user interface shown in the ocular display (1, 2) is also arranged dependent on the location or the position.
13. The system according to any of the claims 1 to 12, characterized in that the user interface device is built in another device.
14. The system according to any of the claims 4 to 8, characterized in that the tracking system comprises one or more transmitters that transmit light signals, sound signals or radio signals for the interpretation of pointing, and/or the tracking system comprises one or more receivers that receive light signals, sound signals or radio signals for the interpretation of pointing.
15. The system according to any of the claims 4 to 8, characterized in that the tracking device is an auxiliary device that is arranged to be coupled to the user interface device.
16. The system according to any of the claims 1 to 15, characterized in that the user interface device is equipped with functions of other hand-held devices, such as mobile station functions and/or camera functions.
PCT/FI2002/000526 2001-06-21 2002-06-17 System for establishing a user interface WO2003003185A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
FI20011321A FI20011321A0 (en) 2001-06-21 2001-06-21 General purpose user interface device
FI20011321 2001-06-21
FI20012231 2001-11-16
FI20012231A FI20012231A (en) 2001-06-21 2001-11-16 System for creating a user interface

Publications (1)

Publication Number Publication Date
WO2003003185A1 true WO2003003185A1 (en) 2003-01-09

Family

ID=26161188

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/FI2002/000526 WO2003003185A1 (en) 2001-06-21 2002-06-17 System for establishing a user interface

Country Status (2)

Country Link
FI (1) FI20012231A (en)
WO (1) WO2003003185A1 (en)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0679984A1 (en) * 1994-04-22 1995-11-02 Canon Kabushiki Kaisha Display apparatus
EP0935183A2 (en) * 1998-02-09 1999-08-11 Sel Semiconductor Energy Laboratory Co., Ltd. Information processing device with head mounted display
US5977950A (en) * 1993-11-29 1999-11-02 Motorola, Inc. Manually controllable cursor in a virtual image
WO2000052536A1 (en) * 1999-03-02 2000-09-08 Siemens Aktiengesellschaft Operating and monitoring system utilizing augmented reality technology


Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2006036069A1 (en) * 2004-09-27 2006-04-06 Hans Gude Gudensen Information processing system and method
WO2006040740A1 (en) * 2004-10-15 2006-04-20 Philips Intellectual Property & Standard Gmbh System for 3d rendering applications using hands
DE102005061211B4 (en) 2004-12-22 2023-04-06 Abb Schweiz Ag Method for creating a human-machine user interface
WO2006092399A1 (en) * 2005-03-04 2006-09-08 Siemens Aktiengesellschaft Method for marking an image content, and marking device
WO2006136738A2 (en) * 2005-06-24 2006-12-28 Daniel Martin Digital pointing device on a screen
WO2006136738A3 (en) * 2005-06-24 2007-03-22 Daniel Martin Digital pointing device on a screen
EP1902352A1 (en) * 2005-07-14 2008-03-26 Crucialtec Co., Ltd. Ultra thin optical pointing device and personal portable device having the same
EP1902352A4 (en) * 2005-07-14 2011-02-02 Crucialtec Co Ltd Ultra thin optical pointing device and personal portable device having the same
US11818458B2 (en) 2005-10-17 2023-11-14 Cutting Edge Vision, LLC Camera touchpad
US11153472B2 (en) 2005-10-17 2021-10-19 Cutting Edge Vision, LLC Automatic upload of pictures from a camera
WO2010076375A1 (en) * 2008-12-31 2010-07-08 Nokia Corporation Display apparatus and device
US7997723B2 (en) 2008-12-31 2011-08-16 Nokia Corporation Display apparatus and device
US9772772B2 (en) 2009-03-18 2017-09-26 Hj Laboratories Licensing, Llc Electronic device with an interactive pressure sensitive multi-touch display
US9547368B2 (en) 2009-03-18 2017-01-17 Hj Laboratories Licensing, Llc Electronic device with a pressure sensitive multi-touch display
US9778840B2 (en) 2009-03-18 2017-10-03 Hj Laboratories Licensing, Llc Electronic device with an interactive pressure sensitive multi-touch display
US10191652B2 (en) 2009-03-18 2019-01-29 Hj Laboratories Licensing, Llc Electronic device with an interactive pressure sensitive multi-touch display
US10496170B2 (en) 2010-02-16 2019-12-03 HJ Laboratories, LLC Vehicle computing system to provide feedback
EP2703954A3 (en) * 2012-09-03 2017-07-05 Samsung Electronics Co., Ltd Method and apparatus for extracting three-dimensional distance information from recognition target
CN103677255A (en) * 2012-09-03 2014-03-26 三星电子株式会社 Method and apparatusfor extracting three-dimensional distance information, terminals and gesture operation method
US9491418B2 (en) 2012-11-15 2016-11-08 Steen Svendstorp Iversen Method of providing a digitally represented visual instruction from a specialist to a user in need of said visual instruction, and a system therefor
WO2014076236A1 (en) * 2012-11-15 2014-05-22 Steen Svendstorp Iversen Method of providing a digitally represented visual instruction from a specialist to a user in need of said visual instruction, and a system therefor
EP2765502A1 (en) * 2013-02-08 2014-08-13 ShowMe Telepresence ApS Method of providing a digitally represented visual instruction from a specialist to a user in need of said visual instruction, and a system therefore

Also Published As

Publication number Publication date
FI20012231A (en) 2002-12-22
FI20012231A0 (en) 2001-11-16

Similar Documents

Publication Publication Date Title
CN210573659U (en) Computer system, head-mounted device, finger device, and electronic device
US11221730B2 (en) Input device for VR/AR applications
CN110647237B (en) Gesture-based content sharing in an artificial reality environment
CN105824431B (en) Message input device and method
EP2733574B1 (en) Controlling a graphical user interface
US10928929B2 (en) Method for displaying a virtual interaction on at least one screen and input device, system and method for a virtual application by means of a computing unit
JP6669069B2 (en) Detection device, detection method, control device, and control method
Loclair et al. PinchWatch: a wearable device for one-handed microinteractions
US11625103B2 (en) Integration of artificial reality interaction modes
US8570273B1 (en) Input device configured to control a computing device
KR20190094381A (en) Virtual User Input Controls in Mixed Reality Environment
WO2016189372A2 (en) Methods and apparatus for human centric "hyper ui for devices"architecture that could serve as an integration point with multiple target/endpoints (devices) and related methods/system with dynamic context aware gesture input towards a "modular" universal controller platform and input device virtualization
CN105264460A (en) Holographic object feedback
CN104272218A (en) Virtual hand based on combined data
US11223719B2 (en) Mobile communication terminals, their directional input units, and methods thereof
JP7135444B2 (en) Information processing device and program
WO2003003185A1 (en) System for establishing a user interface
WO2021073743A1 (en) Determining user input based on hand gestures and eye tracking
CN114115544B (en) Man-machine interaction method, three-dimensional display device and storage medium
US20240019938A1 (en) Systems for detecting gestures performed within activation-threshold distances of artificial-reality objects to cause operations at physical electronic devices, and methods of use thereof
WO2023157121A1 (en) Information processing device and information processing system
CN113849066A (en) Information barrier-free interaction device, system, method and storage medium
WO2023244851A1 (en) Systems for detecting in-air and surface gestures available for use in an artificial-reality environment using sensors at a wrist-wearable device, and methods of use thereof
Nguyen 3DTouch: Towards a Wearable 3D Input Device for 3D Applications

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ OM PH PL PT RO RU SD SE SG SI SK SL TJ TM TN TR TT TZ UA UG US UZ VN YU ZA ZM ZW


AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

ENP Entry into the national phase

Ref document number: 2004113454

Country of ref document: RU

Kind code of ref document: A

122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: JP

WWW Wipo information: withdrawn in national office

Country of ref document: JP